
NVIDIA GeForce GTX 1080 Ti Video Card Review @ [H]

Discussion in 'nVidia Flavor' started by Kyle_Bennett, Mar 9, 2017.

  1. Zarathustra[H]

    Zarathustra[H] Pick your own.....you deserve it.

    Messages:
    22,450
    Joined:
    Oct 29, 2000
    Wow, that's later than usual. They usually do 6am PST
     
  2. Killa|3yte

    Killa|3yte 2[H]4U

    Messages:
    2,267
    Joined:
    Dec 22, 2002
    On the BF1 section, the text above the 4K graph reads "2560x1440 1440p" instead of "4K" or whatever
     
  3. geok1ng

    geok1ng [H]ard|Gawd

    Messages:
    1,769
    Joined:
    Oct 28, 2007
    Doubt that their sales numbers can rival those of low cost 39"-40" VA 4k TVs
     
  4. SomeGuy133

    SomeGuy133 2[H]4U

    Messages:
    3,416
    Joined:
    Apr 12, 2015
    Prove me wrong. Show me the numbers of GPU sales per vendor from 1995-2000. Show me the market was 5 times larger than today because we had about 7 different manufacturers back then, give or take the period.

    2 vendors vs. 7 (give or take) means global sales back then would have needed to be roughly 3 times higher than today per vendor for economies of scale to be the same. The GPU market moved more units in 2005 than today, but definitely not in 1995. Gaming was extremely small back then, and there were 3x more vendors.

    http://www.techspot.com/article/650-history-of-the-gpu/

    You can do the calculations from this source and get an idea of how much lower per-company sales were compared to 2005 and today. Anandtech has an article showing global sales from 2005 to 2013 if you want to educate yourself. Granted, this doesn't give you the needed data for the highest-tier cards.
     
    Last edited: Mar 10, 2017
  5. Crosshairs

    Crosshairs Administrator Staff Member

    Messages:
    23,692
    Joined:
    Feb 3, 2004

    I'm not really sure you understand how this works... when you make a claim, it's not my responsibility to prove you wrong.

    I honestly don't know the numbers, which is why I asked for your source. So either you can back up your claim, or admit you're talking out your ass... either way is fine by me.
     
    cybereality, 50Cal and Armenius like this.
  6. 0x4452

    0x4452 n00bie

    Messages:
    19
    Joined:
    Nov 19, 2016
  7. SomeGuy133

    SomeGuy133 2[H]4U

    Messages:
    3,416
    Joined:
    Apr 12, 2015
    I just posted the numbers, Jesus. You have to extract them and use several different sources, which I referenced. Granted, the per-die/chip data doesn't appear to exist, but you can see market-wide data, and some basic math shows that 1995 numbers were substantially lower and 2005 was higher. Now, large-die vs. large-die data doesn't exist (to the public), as I stated... not sure why they hide 20-year-old data.
     
  8. Crosshairs

    Crosshairs Administrator Staff Member

    Messages:
    23,692
    Joined:
    Feb 3, 2004
  9. Dan_D

    Dan_D [H]ardOCP Motherboard Editor

    Messages:
    50,322
    Joined:
    Feb 9, 2002
    Not a minute or so after I ordered mine HardForum members were posting that the NVIDIA store was showing "out of stock".
     
    Crosshairs likes this.
  10. Sparky

    Sparky 2[H]4U

    Messages:
    2,963
    Joined:
    Mar 9, 2000
    I am forcing myself to wait for the AIBs, which I am hoping will offer better cooling solutions.
     
    SomeGuy133 likes this.
  11. Burzum

    Burzum n00bie

    Messages:
    26
    Joined:
    Dec 15, 2013
    Same here. I wish we had some more info about when the custom cards will be available; the stock blower just doesn't do it for me.
     
    lostin3d likes this.
  12. hyt3k9

    hyt3k9 2[H]4U

    Messages:
    2,211
    Joined:
    Aug 26, 2006
    Anyone else experiencing a shortage of supply in their area like I am? Both my local (within driving distance) Micro Centers got like 5-6 each.
     
  13. SomeGuy133

    SomeGuy133 2[H]4U

    Messages:
    3,416
    Joined:
    Apr 12, 2015
    New GPUs are always out of stock for a few weeks.
     
    Armenius likes this.
  14. pandora's box

    pandora's box [H]ardness Supreme

    Messages:
    4,103
    Joined:
    Sep 7, 2004
    Guru3D overvolts their 1080 Ti in their review. Though I think they just add +50mv to every card they test.
     
  15. DPI

    DPI [H]ardForum Junkie

    Messages:
    9,597
    Joined:
    Apr 20, 2013
    You can set your watch to people being surprised about stock shortages whenever a new Nvidia card launches.

    It's like the people that are shocked they can't get a reservation when a popular new restaurant opens up.
     
    Armenius, Mav451, Comixbooks and 2 others like this.
  16. heatlesssun

    heatlesssun Pick your own.....you deserve it.

    Messages:
    45,588
    Joined:
    Nov 5, 2005
    Sure, it's like that with hot tech. It's just that it's not normally hard to blow $1,400.
     
  17. 50Cal

    50Cal Limp Gawd

    Messages:
    165
    Joined:
    Aug 15, 2011
    I want to know how they got 29C idle temps. Mine hovers around 50C with really good airflow.
     
  18. N4CR

    N4CR [H]ard|Gawd

    Messages:
    1,723
    Joined:
    Oct 17, 2011
    Huh? It's the same damn tech from last year, just at a lower price.

    They did... as above.

    They sold the high-profit chips first, when everyone was getting excited about the 1080 with its small die; now the market is saturated, so they're selling us the scraps they can't offload to the neural-net crowd and the rest. Practically the same thing happened the last two generations.
     
    SomeGuy133 likes this.
  19. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,666
    Joined:
    Apr 17, 2000
    This IMO is a mistake. If you overvolt the GPU, you end up pushing the card to the TDP maximum level quicker, and in the end that can hurt your overclocking or keep it from the highest it can be due to power throttling. The best way to overclock is to first overclock with no voltage modification, find out how high you can get it without voltage, and look at the TDP level it's at. If there is room, or just by trial and error, you can then begin bumping up the voltage slowly, in increments, and see if you can eke the clock higher.

    But every card is going to be overclockable at different levels of voltage. There isn't just one voltage increase that applies to all GPUs; one card could cap out at 20% voltage, another at 30%, so you have to do it manually each and every time for the card in question. Sometimes you'll find the highest and most stable overclock without throttling is achieved with no added voltage at all.

    GPU Boost is variable, but if you try to push it too far and hit that power cap, the clock is going to throttle down after long periods of gaming. This is also why it's important to test the clock speed in gaming for about 30 minutes to make sure it doesn't drop down, or to see what it drops down to. This is how we do our testing. Memory overclocking also affects the power draw of the card and can likewise cause the clock to throttle down if pushed too hard. There are a lot of variables to consider.

    When you look at our overclocking results and final stable overclock, you can be assured the card is maintaining that clock frequency consistently during long periods of gaming. We even show it in our clock-speed-over-time graph to prove the data. The last thing you want is to overclock a card and think you are running a certain frequency when, in reality, in games it isn't after 10 minutes of gaming. A benchmark run, remember, is very short; a benchmark like 3DMark doesn't run long enough to show the real result of gaming for 30 minutes.
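
    If you want to sanity-check your own card the same way, a rough sketch like the one below can log the sustained clock, power, and temperature while you game (this is just an illustration, not our actual test tooling - it assumes an NVIDIA card with nvidia-smi available on the PATH):

    # Log GPU clock/power/temperature every few seconds for ~30 minutes while gaming,
    # then check whether the sustained clock drops once the power or thermal cap is hit.
    # A rough sketch only - assumes nvidia-smi is installed and on the PATH.
    import csv
    import subprocess
    import time

    FIELDS = "clocks.sm,power.draw,temperature.gpu"
    DURATION_S = 30 * 60   # ~30 minutes of gaming, per the methodology above
    INTERVAL_S = 5

    def sample():
        out = subprocess.check_output(
            ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
        first_gpu = out.splitlines()[0]  # first GPU only
        clock_mhz, power_w, temp_c = (v.strip() for v in first_gpu.split(","))
        return float(clock_mhz), float(power_w), float(temp_c)

    with open("clock_over_time.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "sm_clock_mhz", "power_w", "temp_c"])
        start = time.time()
        while time.time() - start < DURATION_S:
            clock, power, temp = sample()
            writer.writerow([round(time.time() - start, 1), clock, power, temp])
            f.flush()
            time.sleep(INTERVAL_S)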
     
    Last edited: Mar 11, 2017
  20. pandora's box

    pandora's box [H]ardness Supreme

    Messages:
    4,103
    Joined:
    Sep 7, 2004
    Yeah, I don't trust Guru3D's overclocking after I saw they added voltage in the 1080 Ti review. I believe they did the same in the 1080 and Titan X reviews too. I usually go by HardOCP and PC Perspective for overclock results, as you guys test over a long period.

    Review sites that just smack the fan speed to 100% and then overclock - pointless, as it's not realistic.
     
    ghostwich likes this.
  21. SomeGuy133

    SomeGuy133 2[H]4U

    Messages:
    3,416
    Joined:
    Apr 12, 2015
    I like it and would like to see higher power limits, via BIOS or software. Gigabyte's software allows 130%, which is super hard to push with a 980 Ti; I hit it once or twice with max software voltage on my 980 Ti at 1570 clocks.

    I am hoping the AIB cards have 130% power and voltage control. I would like to see 2100 constant if possible.

    1570 was only stable in Saints Row 3 at 4K, and PS2 was stable for like a couple of hours.

    Real stable clocks for a whole session were 1550 with max 128% power spikes.

    Other games were only stable at 1500 with 122-125% spikes at max voltage.

    Mine was also like 65% ASIC, lawls, but somehow my total shit card can do 1500 in all games with 70-100% fan :D (Two years later it's starting to show a little instability, but I got two years out of it, so whatever; it's not hard to alt-tab, reset the overclock, and go back to War Thunder or whatever game. Now it crashes like once every few hours, but whatever. Alt-tab, click apply, and good to go!)
     
    Last edited: Mar 11, 2017
  22. CSI_PC

    CSI_PC 2[H]4U

    Messages:
    2,096
    Joined:
    Apr 3, 2016
    That is in the overclocking section, not the normal benchmarking section, of his reviews: http://www.guru3d.com/articles_pages/geforce_gtx_1080_ti_review,32.html

    If you compare the review results to the three bars in the overclock section, you can see he does the reviews without changing voltages.
    Cheers
     
  23. pandora's box

    pandora's box [H]ardness Supreme

    Messages:
    4,103
    Joined:
    Sep 7, 2004
    I doubt we'll see cards with a custom BIOS. I don't think we ever got them with the regular 1080. I didn't really look into it myself, but no one else has been able to hack NVIDIA's security on the BIOS. Hell, I'm not sure you can even flash the BIOS on Pascal cards.

    AND CSI_PC: Yeah, I know. Guru3D reviews the card's performance in games and then he overclocks.
     
  24. PanzeR-

    PanzeR- Limp Gawd

    Messages:
    433
    Joined:
    Jun 17, 2006
    As a GTX 1080 owner since release, I am quite happy with the current performance of my card, but the GTX 1080 Ti has me hyped for what we can expect from Volta!
     
    Raendor likes this.
  25. CSI_PC

    CSI_PC 2[H]4U

    Messages:
    2,096
    Joined:
    Apr 3, 2016
    A custom BIOS is available, but only for the most expensive high-end models that are also typically used for LN2; the only two custom BIOSes I know of are for the GTX 1080 HoF and the top GTX 1080 Strix, and they don't do much on other models.

    Doh, I misunderstood your post :)
    But from the reviews of his I have read, the voltage did not seem to be fixed at +50mV (I've seen him get better results on some AMD 480s with lower voltages compared to others needing more), though I must say I am not sure of his methodology in the use of said voltages.
    Cheers
     
  26. SomeGuy133

    SomeGuy133 2[H]4U

    Messages:
    3,416
    Joined:
    Apr 12, 2015
    Not sure why there wouldn't be. 980 Tis had custom BIOSes. I never bothered, because my 980 Ti had more software controls than standard and that was sufficient. 30 or 40mV and 130% power was all I needed to get my desired results on the 980 Ti.
     
  27. CSI_PC

    CSI_PC 2[H]4U

    Messages:
    2,096
    Joined:
    Apr 3, 2016
    Well, in interviews Tom Peterson (an engineer, but in technical marketing at NVIDIA) has mentioned it is to protect the life of the silicon - the transistors start to degrade with increased voltage, and they want the card to have a life of 5 years. Yeah, try not to laugh too hard at that when most of us upgrade every 14 months and NVIDIA introduces new stuff every 12 months :)
    He accepts they are being overly conservative in the limit of that control, but this is part of what happens with the node shrink, with higher density and tighter tolerances/thresholds.
    Part of it is that they see it as a race, with various AIBs and consumers pushing it ever higher and reducing the statistical life span too much, and I would say with this node shrink it is more sensitive than in the past, with a much narrower window.
    Cheers
     
    Last edited: Mar 11, 2017
    pandora's box likes this.
  28. Raendor

    Raendor Gawd

    Messages:
    636
    Joined:
    Sep 21, 2015
    Oh yeah. If they managed to roughly double the performance of the x70, x80, and x80 Ti cards from the previous gen this time, I wonder what Volta will bring. If Intel brings 6 cores to the mainstream, I might do a complete upgrade and move my current hardware to an ITX console-killer box in the next year.
     
  29. SomeGuy133

    SomeGuy133 2[H]4U

    Messages:
    3,416
    Joined:
    Apr 12, 2015
    I am more interested in HBM, for the sake of smaller cards.
     
  30. ZLoth

    ZLoth Gawd

    Messages:
    622
    Joined:
    Apr 13, 2010
    Here are my thoughts....

    The 1080 Ti, or any video card for that matter, is only as good as the monitor it is hooked up to and how well optimized the graphics are. When I built my computer in September 2014, I did splurge on the 980, even having to wait two months until the model I wanted came in stock. With some minor exceptions, I got capped out at 60FPS on my then-current Acer H6 H276HLbmid. That meant that when the 980 Ti, and then the 1080, were introduced, I would see almost no performance benefit because the limiting factor was my monitor. That changed in December when I got my ViewSonic G-Sync monitor.

    Looking at the numbers, I am drooling over the 2560x1440 framerates. They aren't hitting 165FPS, which is the maximum refresh rate of my monitor, but they are more consistently hitting 60FPS averages and sometimes exceeding them. This is a good thing. It will be interesting when we get away from Founders Edition cards and into the AIB overclocked versions with custom heatsinks. And this is why I'm going to wait a few months until the introduction shortages work themselves out and we see what kind of response comes from Team Red.

    One thing that I did notice, at least on the Founders Edition cards, is that the connectors are only DisplayPort and HDMI, with DVI finally being ditched (although you can use a DisplayPort-to-DVI cable). HDMI connectors were on cards as early as the 480 Ti, while DisplayPort was included on the 680 Ti. (This, by the way, was from a Google image search, so please correct me.) I think some of the AIBs will still include a DVI connector.
     
  31. Quartz-1

    Quartz-1 [H]ardness Supreme

    Messages:
    4,182
    Joined:
    May 20, 2011
    I beg to differ. People don't only game on their PCs and a high-DPI monitor - I have a 28" 4K monitor and a 24" 4K monitor - makes text usage much nicer.
     
  32. Quartz-1

    Quartz-1 [H]ardness Supreme

    Messages:
    4,182
    Joined:
    May 20, 2011
    I have a specific question about the 1080 Ti. My Maxwell Titan X will not run slow - and thus quiet - in 2D mode if I have three monitors attached. Nor indeed if I have only the two 4K monitors attached. If I were to attach three 4K monitors to the 1080 Ti will it run slow and quiet?
     
  33. ghostwich

    ghostwich [H]ard|Gawd

    Messages:
    1,230
    Joined:
    Sep 10, 2014
    Wasn't this a bug in a driver a while back? Somehow multi-mon setups cause the cards to run at full clocks?

    http://www.144hzmonitors.com/other/...or-multi-monitor-setups-at-high-refresh-rate/

    Something like that? If it's not resolved in the newest drivers, I would have a quick (yeah I know) look at their geforce forums to see if it's been reported.
     
  34. Quartz-1

    Quartz-1 [H]ardness Supreme

    Messages:
    4,182
    Joined:
    May 20, 2011
    Not so much a bug but a design flaw. It's been present since at least the 780s.
     
  35. pandora's box

    pandora's box [H]ardness Supreme

    Messages:
    4,103
    Joined:
    Sep 7, 2004
    That driver bug keeps getting fixed and then broken again. I'm not sure of its current status, but if a Titan X (Maxwell) is still doing it, I am assuming it's not fixed currently.
     
  36. SomeGuy133

    SomeGuy133 2[H]4U

    Messages:
    3,416
    Joined:
    Apr 12, 2015
    165Hz is trash, plus it's always CPU-limited in single-threaded games and even in multi-threaded games. Single-threaded games struggle to hold 120Hz on my rig below. Additionally, if you go NVIDIA, 120Hz ULMB is multiple times better than 165Hz (go to blurbusters.com and educate yourself, for the love of anything), but realize that minimum frame rate is critical with ULMB or the stuttering is horrific. Even on my rig below, single-threaded games suck with stutter, and GPU-intensive games are an issue too. Can't wait for my 1080 Ti... waiting for the AIB ones.

    I have 4K on a 27" screen, the Dell 4K 27" IPS, and I do Photoshop (used to) and game on it, and 4K is not wasted on 27". Anyone who says it is... is flat-out ignorant or an idiot. 4K at 32" is gold, but 27" works. I just didn't have two grand for the 32", so $500 for the 27" was a wise choice and not a waste. Plus those 28" TN panels are utter fucking trash. Really sick of stupid people repeating this dumbass shit.

    How many times have I said this now? 40?

    EDIT: Again, your eyes see ~600 DPI from 1 or 2 feet away, and an 8K 30" screen is like 200 or 300 DPI, so an 8K 32" screen still isn't the max your eyes can tell. I forget the exact numbers, but in the age of the fucking internet can no one use their brain and Google before spewing and spreading this garbage?

    275.36 PPI... just did the math for you. 32" 8K is 275... almost perfect for the human eye at 2 feet away.
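
    If anyone wants to check the arithmetic, PPI is just the diagonal pixel count divided by the diagonal size in inches - a quick sketch in Python below (my own back-of-the-envelope math, not figures from the review):

    # Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
    # My own quick calculation, not [H]'s data.
    from math import hypot

    def ppi(width_px, height_px, diagonal_in):
        return hypot(width_px, height_px) / diagonal_in

    print(round(ppi(7680, 4320, 32), 2))  # 8K at 32": ~275.36 PPI
    print(round(ppi(3840, 2160, 27), 2))  # 4K at 27": ~163.18 PPI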

    This is why VR is pixelly as fuck and total trash ATM. The DPI of VR is way too low. We really need 8K or 16K VR headsets for the really cool stuff. 4K VR is the bare minimum in my book. Once we get 8K/16K VR headsets we can start doing really cool virtual desktops!
    http://techdissected.com/ask-ted/ask-ted-how-many-ppi-can-the-human-eye-see/

    I posted the exact numbers in the past, but just Google it, man. Your eyes and body have amazing response times. You can feel the difference in response times down to 20ms, if not less! I've posted about the IBM study from the '80s on rapid responses. Sub-100ms response times are amazing, but shitty UI makes things 300-500ms and it's terrible. Likewise, your eyes can perceive amazing amounts of color and blur. 2ms vs. 1ms is huge in clarity.

    Educate yourself. It is 2-3x faster, but the perception of clarity is magnitudes better.
    https://www.blurbusters.com/zero-motion-blur/10vs50vs100/
    https://www.blurbusters.com/faq/oled-motion-blur/
     
    Last edited: Mar 13, 2017
    Mav451 and razor1 like this.
  37. ZLoth

    ZLoth Gawd

    Messages:
    622
    Joined:
    Apr 13, 2010
    And your point, as it relates to my post, is... what, again?

    Per the tests conducted by [H], the highest average framerates for a game using the 1080 Ti at 2560x1440 were 140FPS in Sniper Elite 4 and 159FPS in Doom. Those were the numbers I was looking at, and what concerned me under my current setup. I didn't say "2K" or "4K".

    I also stated that the performance of the card is monitor-dependent. Where am I wrong on this?

    And what is the actual resolution you are talking about when you say "4K"? Because when I hear "4K", I think of those "4K" television sets where the actual resolution is more like 2K-ish.
     
  38. SomeGuy133

    SomeGuy133 2[H]4U

    Messages:
    3,416
    Joined:
    Apr 12, 2015
    EDIT: I edited my post above, adding the Zara quote, and baby-fed you guys by Googling human-eye DPI.......

    ~300 DPI at 2 feet.

    The 4K part was referring to Zara... forgot to quote him too. Sorry for the confusion, but...

    4K refers to 4000x_______. FHD is 2K; 2560x1440 is 2K/2.5K. The "_K" goes by the first number. A 4K monitor is usually 3840x2160.
    https://en.wikipedia.org/wiki/4K_resolution

    And if you don't understand my post, then I won't waste any more time typing, because I will simply be wasting my time on you. *facepalm*


     
    Last edited: Mar 13, 2017
    razor1 likes this.
  39. ZLoth

    ZLoth Gawd

    Messages:
    622
    Joined:
    Apr 13, 2010
    Since you edited your reply to add some clarification..... I'm going to reply as well, from *MY* viewpoint.

    The maximum refresh rate of the panel in my current monitor is 144Hz, but it can be overclocked to 165Hz. It says so in the specifications. That, to me, means a maximum theoretical framerate of 165 frames per second. The higher the framerate, the smoother the animation. Isn't it safe to say that the 1080 Ti achieves higher framerates than my current 980? [H] and multiple other sites say "YES!". Yes, some of the graphics performance is dependent on the CPU, which is why [H] and other sites test several games, not just one or two.
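
    Put another way, the frame-time budget at a given refresh rate is just 1000/Hz milliseconds - a quick bit of arithmetic of my own, sketched below:

    # Frame-time budget per refresh rate (my own arithmetic, not from the review).
    for hz in (60, 144, 165):
        print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
    # 60 Hz -> 16.67 ms, 144 Hz -> 6.94 ms, 165 Hz -> 6.06 ms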

    Now, looking at Newegg, if you want a 5K (5120x2880) or a 4K (3840x2160) monitor, the highest refresh rate is 60Hz. Drop down to 2K (2560x1440), and the max refresh rate goes to 144Hz. Since I prefer IPS monitors, it's safe to say that the DPI is higher on a monitor whose maximum resolution is 2560x1440 than on one at 1920x1080. I do notice the smoothness of the fonts and the additional screen real estate at the higher resolution, not to mention the importance when editing photos.

    Now, don't G-Sync and the associated adaptive-sync technology, which includes LightBoost, help with the "blur" factor? I've noticed the difference since I upgraded my monitor.

    If anything, your post just confirms my initial point.... which is that graphics card performance is dependent on the monitor it is hooked up to.
     
    Last edited: Mar 13, 2017
  40. chenw

    chenw 2[H]4U

    Messages:
    3,535
    Joined:
    Oct 26, 2014
    The problem with ALL monitor-related arguments is that they are, and ALWAYS will be, subjective.

    One can argue until the dawn of time about the benefits of A while saying B is trash (when A and B are exclusive), but you cannot force another person to see what he/she doesn't or cannot see.

    I have tried both ULMB and G-Sync on my monitor, and TBPH, I cannot actually tell the difference between them - and this was playing CS:GO, to make sure I was hitting the framerates required to maximise the ULMB benefit and minimise the G-Sync benefit.

    I noticed absolutely no difference. The only difference I could see was in Fallout 4, but that difference showed only when I was moving the mouse EXTREMELY quickly, and only if I was actually looking for it.

    So I don't use ULMB, because it severely limits my monitor's maximum brightness; during the day I need to go beyond 120 nits due to how bright my room gets in the morning, and ULMB can barely reach that high (at max pulse width and brightness).