2080ti 2080 Ownership Club

Discussion in 'nVidia Flavor' started by Comixbooks, Oct 5, 2018.

  1. Compddd

    Compddd [H]ard|Gawd

    Messages:
    1,568
    Joined:
    Aug 6, 2003
    If I set my power limit from 100 to 112 in Afterburner, will that allow my 2080 Ti Black to boost to higher clocks?
     
  2. Maddness

    Maddness [H]ard|Gawd

    Messages:
    1,209
    Joined:
    Oct 24, 2014
    That is what should happen. I would be very interested in hearing your results. Is your card a 300 or 300A BIOS?
     
  3. Compddd

    Compddd [H]ard|Gawd

    Messages:
    1,568
    Joined:
    Aug 6, 2003
  4. XoR_

    XoR_ Gawd

    Messages:
    814
    Joined:
    Jan 18, 2016
    Yes, it will allow your 2080Ti Black to burn faster
    RmIB6zl.gif
     
  5. Compddd

    Compddd [H]ard|Gawd

    Messages:
    1,568
    Joined:
    Aug 6, 2003
    It seems to boost in the low 1800s during gaming. Is that pretty good for a $999 Ti? Any way to check if it's a 300 or 300A chip?
     
  6. Maddness

    Maddness [H]ard|Gawd

    Messages:
    1,209
    Joined:
    Oct 24, 2014
    GPU-z will tell you.
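If you'd rather check from the command line, nvidia-smi reports the same power limits GPU-Z does. A minimal sketch; the parsing assumes nvidia-smi's csv,noheader,nounits output format, and the sample values are illustrative:

```python
# Sketch: derive the power-limit slider ceiling from nvidia-smi output.
# The actual query would be:
#   nvidia-smi --query-gpu=power.default_limit,power.max_limit --format=csv,noheader,nounits

def parse_power_limits(csv_line):
    """Parse one CSV line of watt values into (default_limit, max_limit) floats."""
    default_w, max_w = (float(field.strip()) for field in csv_line.strip().split(","))
    return default_w, max_w

# Illustrative values for a non-A "300" card (260 W default, 112% ceiling):
default_w, max_w = parse_power_limits("260.00, 292.00")
print(f"slider ceiling: {max_w / default_w:.0%}")  # -> 112%
```

A max limit only ~12% above the default is the telltale sign of the locked-down BIOS discussed below.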
     
  7. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,192
    Joined:
    Feb 22, 2012
    My Asus Dual boosted into the high 1800s. If it settles around there that’s fine. You won’t get much higher without a higher power limit found on the $1300+ cards. Max stable on any card with high PL and great cooling is usually 2050-2100. So you are within 10% of best case.
     
  8. Compddd

    Compddd [H]ard|Gawd

    Messages:
    1,568
    Joined:
    Aug 6, 2003
    I am looking at GPU-Z and I can't tell where it is, lol. What field should I be looking in?
     
  9. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,606
    Joined:
    Jun 13, 2003
    Post some screenshots
     
  10. Compddd

    Compddd [H]ard|Gawd

    Messages:
    1,568
    Joined:
    Aug 6, 2003
  11. Maddness

    Maddness [H]ard|Gawd

    Messages:
    1,209
    Joined:
    Oct 24, 2014
    I'm pretty sure it's on the Advanced tab. The main difference is that the 300A can have a higher power limit, up to 380 watts, and you are able to flash it with different BIOSes. The 300 has a power limit of 112%.
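To put those numbers side by side: the Afterburner slider is a percentage of the board's base TDP, so assuming the usual 260 W reference TDP for the 2080 Ti (an assumption; your BIOS may differ), the two BIOS types work out roughly like this:

```python
def power_limit_watts(base_tdp_w, slider_percent):
    """Convert a power-limit slider percentage into total board watts."""
    return base_tdp_w * slider_percent / 100.0

BASE_TDP_W = 260.0  # assumed reference 2080 Ti TDP; check your own BIOS

# Non-A "300" BIOS: the slider tops out at 112%
print(power_limit_watts(BASE_TDP_W, 112))  # -> 291.2

# A "300A" BIOS allowing ~380 W corresponds to roughly this slider value:
print(380.0 / BASE_TDP_W * 100)  # ~146
```

So the 300A headroom is not just a few percent; it is nearly 90 W of extra board power on the same base TDP.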
     
  12. Compddd

    Compddd [H]ard|Gawd

    Messages:
    1,568
    Joined:
    Aug 6, 2003
    My power limit in Afterburner is only 112%, so I am pretty sure I have a 300 chip then.
     
  13. XoR_

    XoR_ Gawd

    Messages:
    814
    Joined:
    Jan 18, 2016
    If it is an EVGA Black and the price was "only" $999, then it is certainly a "300" chip.
    Not that it matters much anyway, as this whole chip segregation seems artificial, a simple ploy to stop card manufacturers from selling overclocked cards or setting a higher TDP on models cheaper than the Founders Edition. All Turing chips overclock pretty much the same anyway.

    What matters is the TDP limit in the BIOS, because especially with OC it can limit performance somewhat.
    I would not worry too much about it. In games the TDP does not hit these limits that often, especially without something like an i9 9900K and especially if you do not have a 4K monitor.

    Anyway, $999 should be the most these cards cost, and a non-FE chip is better value for money (y)
     
  14. Compddd

    Compddd [H]ard|Gawd

    Messages:
    1,568
    Joined:
    Aug 6, 2003
    My boost is all over the place during gaming, sometimes it’s mid 1800s and then goes to mid 1600s. Is that normal?
     
  15. Commander Shepard

    Commander Shepard 2[H]4U

    Messages:
    3,819
    Joined:
    Apr 12, 2016
    All the 2080Ti flame out problems drove me to the 2080. I'm perfectly satisfied with its performance. My agency pays for all my hardware, so I feel extremely lucky to get a $900 GPU for nothing.
     
  16. doughead

    doughead Limp Gawd

    Messages:
    142
    Joined:
    Sep 3, 2010
    I am considered one of the dunce "side-graders" who went from 2x GTX 1080s in SLI to an RTX 2080 Ti. It was my first time getting a "Ti" series card, as all along all my rigs were using SLI (GTX 580/780/1080s x2), so I mistakenly thought 2x GTX 1080s in SLI = an RTX 2080 Ti.

    To make a long story short, I wasn't disappointed with the RTX 2080 Ti, although I wasn't seeing any major boost in frame rates coming from GTX 1080 x2 SLI. BUT what a whopping difference it made to games without SLI support (or with poor SLI scaling) like Doom, Wolfenstein, Star Wars Battlefront, Titanfall 2, Resident Evil 7, etc.

    Out of curiosity, I tried using one of the GTX 1080s as a dedicated PhysX card to complement the RTX 2080 Ti and found rather odd results:

    Nvidia PhysX tests
    (A) RTX 2080 Ti + GTX 1080 vs.
    (B) RTX 2080 Ti alone

    * Metro 2033 benchmark
    (A) 92 fps
    (B) 94 fps (!)

    * Batman Arkham Knight
    (A) 101 fps
    (B) 95 fps

    * Rise of the Tomb Raider
    (A) 118 fps
    (B) 113 fps

    * Shadow of the Tomb Raider
    (A) 84 fps
    (B) 86 fps (!)

    The use of the GTX 1080 as a dedicated PhysX card was simply not worth it, with just a marginal boost in frame rates ranging from 2 to 6 fps, and strangely some instances even going against its favour! Not only that, it also showed that the RTX 2080 Ti certainly has no problem running PhysX effects maxed out on its own, without the need for a dedicated GPU, as the frame rates were all comfortably above 90 fps.

    But the bottom line was that I had expanded my upgrade options going from 2x GTX 1080s in SLI to a single RTX 2080 Ti, as my next upgrade path in future would be to add another RTX 2080 Ti for SLI. As for added bonuses like ray tracing and DLSS, I have yet to see whether the improved eye candy requires a sacrifice in frame rates, so I'm not too optimistic about these features yet.
     
    cybereality likes this.
  17. NukeDukem

    NukeDukem 2[H]4U

    Messages:
    2,265
    Joined:
    Feb 15, 2011
    The Strix O11G is a beast! I love the looks of this thing. It reminds me of something created by Cyberdyne Systems:

    IMG_0275.JPG
     
    Last edited: Jan 2, 2019
  18. Furious_Styles

    Furious_Styles [H]ard|Gawd

    Messages:
    1,375
    Joined:
    Jan 16, 2013
    You're not a dunce, SLI is generally inferior these days to a single card. Much less headaches trying to get profiles and games working properly.
     
  19. doughead

    doughead Limp Gawd

    Messages:
    142
    Joined:
    Sep 3, 2010
    By the way, I just got BFV at half price and couldn't wait to try out ray tracing. I'd just like to share my findings on my i7-4770K @ 4.2GHz / 16 GB RAM / RTX 2080 Ti / Win10 version 1809 setup:

    Framerates tested at the beginning of the story mode's "Under No Flag" chapter, first "Follow Mason" level:
    DX11 (DXR disabled) : 110 fps
    DX12 (DXR enabled @ Ultra) : 52 fps
    DX12 (DXR enabled @ High) : 52 fps
    DX12 (DXR enabled @ Medium) : 80 fps

    This is with all graphics settings at Ultra except TAA at Low, running at ultra-wide 3440x1440 resolution.
    Medium DXR seems to be the sweet spot with everything at Ultra, but I'm sure it can run at DXR High/Ultra if some of the graphics settings are lowered correspondingly.

    I'll settle for higher framerates in DX11 with ray tracing off.
    I play BFV to engage in battle, not to admire my own reflection in a glossy metallic car/tank/the barrel of my rifle.
    That would really be neat in a slower-paced platformer or adventure RPG (Tomb Raider, Bioshock, Spiderman, Witcher), but not a fast-paced war FPS.
     
    noko, Maddness and cybereality like this.
  20. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,606
    Joined:
    Jun 13, 2003
    A bit to be expected with BFV; I might leave it on for the single-player campaign stuff, but multiplayer will be multiplayer.
     
  21. Polytonic

    Polytonic n00b

    Messages:
    41
    Joined:
    Jan 3, 2011
    I came to the same conclusion (traded my single GTX 1080 for a single RTX 2080 Ti) -- have you considered using one of your 1080s as an NVENC card instead?

    As I understand it, the RTX cards are supposed to have a more efficient NVENC chip onboard, but I'm wondering if maybe it makes sense to use the 1080 instead for local recording (at extremely high bitrates) without consuming bandwidth on the main rendering card.
     
  22. doughead

    doughead Limp Gawd

    Messages:
    142
    Joined:
    Sep 3, 2010
    What is NVENC? Anyway, I've already sold both my GTX 1080s.
     
  23. Legendary Gamer

    Legendary Gamer Gawd

    Messages:
    544
    Joined:
    Jan 14, 2012
    Only thing I can think of is dedicating it as a PhysX board... which I believe you can still do. Otherwise, I'm as lost as you are.
     
  24. CAD4466HK

    CAD4466HK [H]ard|Gawd

    Messages:
    1,218
    Joined:
    Jul 24, 2008
    NVENC is the video encoding hardware that has been on every Nvidia GPU since Kepler.
    It lets the GPU encode video in hardware using its own codec support, offloading the work from the CPU. Every new generation of NVENC adds more codec support or raises FPS thresholds, allowing less and less reliance on the CPU and software. Turing carries the 6th-generation NVENC.

    Every time you use OBS to capture or stream, for example, you are using this engine (watching a movie exercises the GPU's hardware decode side instead).
    The next time you watch a movie, have GPU-Z running in the background and pay attention to the Video Engine Load field; that shows you the engine's usage.
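For a concrete example of putting the encoder to work outside of OBS: ffmpeg can target NVENC directly. This is a sketch, assuming an ffmpeg build compiled with NVENC support; the file names and bitrate are made up:

```python
import shlex

def nvenc_transcode_cmd(src, dst, bitrate="40M"):
    """Build an ffmpeg command line that offloads H.264 encoding to NVENC."""
    return [
        "ffmpeg", "-y",
        "-i", src,             # input file
        "-c:v", "h264_nvenc",  # encode on the GPU's NVENC block instead of the CPU
        "-b:v", bitrate,       # target video bitrate
        "-c:a", "copy",        # pass the audio stream through untouched
        dst,
    ]

cmd = nvenc_transcode_cmd("capture.mkv", "capture_small.mp4")
print(shlex.join(cmd))
# Run it with subprocess.run(cmd, check=True) once the command looks right;
# watch GPU-Z's Video Engine Load climb while it encodes.
```

CPU usage stays low during the encode because the heavy lifting happens on the dedicated NVENC silicon, not the shader cores.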
     
    Legendary Gamer likes this.
  25. doughead

    doughead Limp Gawd

    Messages:
    142
    Joined:
    Sep 3, 2010
    Thanks. Learnt something new. It's good to know that the GPU can offer hardware acceleration for playing movies as well. Especially useful for 4K movies, although I don't have the software to play them.
     
    CAD4466HK likes this.
  26. Legendary Gamer

    Legendary Gamer Gawd

    Messages:
    544
    Joined:
    Jan 14, 2012
    I knew video cards accelerated video, but I had completely spaced on the fact that Nvidia had a dedicated encoding engine. As for 4K playback, it will get here in online streaming as some compressed-to-hell, upscaled nightmare that our cards will accelerate. I think there are only like 1-3 PC optical drives that can handle playback (Pioneer, LG maybe); it's not like playing HD DVD or standard Blu-ray on PC, the latter of which got really cheap to do over time. I gave up and bought a decent LG 4K player for 120 bucks after spending too much time looking at how I had to use an Intel iGPU to push it (no thanks).
     
  27. Legendary Gamer

    Legendary Gamer Gawd

    Messages:
    544
    Joined:
    Jan 14, 2012
    Ty for taking the time for the info. It's amazing what I take for granted with tech sometimes.
     
    CAD4466HK likes this.
  28. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,606
    Joined:
    Jun 13, 2003
    This is less about the resolution and more about the codec used. HEVC, being the common one, is very resource-intensive to decode and even more resource-intensive to encode, but it is also very efficient in terms of bandwidth. Personally, I feel that Netflix and Amazon do a pretty good job with 4K streaming. It's not UHD Blu-ray, but it gets the job done. I buy the discs of the stuff I want all the detail and the Dolby Atmos/Vision tracks for.

    [One community that's pretty well informed on whose encoders/decoders are doing well with which codecs is the Plex streaming community. Right now, NVENC is the 'gold standard', Intel's iGPs are actually pretty good, and AMD's aren't worth using; for those looking to support a large number of streams, cheap Quadros work best!]
     
    Legendary Gamer likes this.
  29. Legendary Gamer

    Legendary Gamer Gawd

    Messages:
    544
    Joined:
    Jan 14, 2012
    Dude! Loving the information today! You guys rock! I'm totally checking that out. I was considering running Plex in my home once I convert my 100+ DVDs to digital media.

    What should I rip the DVDs with? I used to use things like MakeMKV, but you seem to know a lot more about this stuff than I do. Is there a decent program out there that I can rip my HD DVD collection with? The only things I want to hang on to these days are my Blu-rays.
     
  30. BoiseTech

    BoiseTech Limp Gawd

    Messages:
    319
    Joined:
    Mar 15, 2018
    If you don't have data caps, just download them. It's legal because you own a copy.
     
  31. Legendary Gamer

    Legendary Gamer Gawd

    Messages:
    544
    Joined:
    Jan 14, 2012
    I'm fairly certain that torrenting them off of YTS or another pirate outlet is still illegal. I would have a lot of fun telling a judge, "I own a physical copy of the movie right here, so I went to a pirate site and downloaded an illegally ripped copy for myself...". Lol.

    Some films come with access to the digital files. Ripping them to my local server keeps my ass and my IP address the hell away from copyright trolls.

    Yes, I know I could use a VPN but even those services aren't bulletproof.
     
  32. BoiseTech

    BoiseTech Limp Gawd

    Messages:
    319
    Joined:
    Mar 15, 2018

    If you're going to nitpick, even ripping it is illegal.

    https://lifehacker.com/5978326/is-it-legal-to-rip-a-dvd-that-i-own

    Don't torrent. Use Usenet (which is SSL-encrypted), and then use a VPN to hide your DNS, with a kill switch to cut off the internet if it fails. *cough* A friend of mine does this, has for years, and has never received a single DMCA request.
     
  33. Legendary Gamer

    Legendary Gamer Gawd

    Messages:
    544
    Joined:
    Jan 14, 2012
    I know. Not nitpicking, just saying what's safer and uses no data. I'm an old IT guy that never touched Usenet. Amazingly.

    I have a standard Comcast data cap... which is amazing on gig internet.

    To each their own, my friend.
     
    BoiseTech likes this.
  34. Solhokuten

    Solhokuten [H]ard|Gawd

    Messages:
    1,219
    Joined:
    Dec 9, 2009
    Did you reuse the thermal pads or did it come with a sheet?
     
  35. spintroniX

    spintroniX Gawd

    Messages:
    957
    Joined:
    Apr 7, 2009
    In for two Founders Edition cards.

    Spent a couple hours with some deep learning last night, and no evidence of failure yet. Fingers crossed.

    https://imgur.com/Vdlcz9a
     
    Solhokuten, DooKey, Porter_ and 2 others like this.
  36. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    18,526
    Joined:
    Jan 28, 2014
    So apparently EVGA released the hybrid kit a couple weeks before Christmas while I wasn't paying attention. Hopefully more will come in stock and I can grab one. Anyone get one and install it on theirs? People on the forum have been complaining about malfunctioning pumps and failed solder joints on the fan/RGB cable, so hopefully that was only an issue with the first batch.

    https://www.evga.com/products/product.aspx?pn=400-HY-1384-B1
     
  37. TahoeDust

    TahoeDust Limp Gawd

    Messages:
    494
    Joined:
    Dec 3, 2011
    It came with new pads installed on the block.
     
    Solhokuten likes this.
  38. TahoeDust

    TahoeDust Limp Gawd

    Messages:
    494
    Joined:
    Dec 3, 2011
  39. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,192
    Joined:
    Feb 22, 2012
    I had 3x 1080 Ti Hybrids from EVGA die after a month. I think they just suck balls.

    I went full water block with my 2080ti... really not much more in the scheme of things.

    The kicker with EVGA is that they wanted me to pay shipping cross-country for three cards with a known issue. Fuck them. I liked Legendary Gamer's thread where, after four RMAs, they would refund his 2080 Ti with a 15% restocking fee.
     
    Last edited: Jan 9, 2019