Former or current 1080 ti/Titan (Pascal) owners

If you were the owner of a 1080 Ti/Titan, did you buy a new 2080 Ti/Titan when it released?

  • Yes, I bought a new 2080 ti/Titan.

    Votes: 21 15.1%
  • No, I am waiting for next gen/price drops.

    Votes: 108 77.7%
  • No, I bought an AMD card instead.

    Votes: 10 7.2%

  • Total voters
    139
Are you more comfortable with the location of the goal posts now that you have moved them somewhere new? The OP does not mention gaming. Your "definitive" answer talks about a tiny niche within FPS gaming which itself is a niche within the submarket of gaming. For your 0.001% niche, the 1080Ti isn't worth it either because improved image fidelity is counterproductive. For everybody else, the 2080 Ti is a massive improvement.

You're right, I should have specified that this isn't about work-related upgrades. In all honesty that won't matter much though; I appreciate as much feedback as people will give!
 
Are you more comfortable with the location of the goal posts now that you have moved them somewhere new? The OP does not mention gaming. Your "definitive" answer talks about a tiny niche within FPS gaming which itself is a niche within the submarket of gaming. For your 0.001% niche, the 1080Ti isn't worth it either because improved image fidelity is counterproductive. For everybody else, the 2080 Ti is a massive improvement.

The thread is about gaming cards, not productivity. You were just bragging, don't act like you weren't. The whole "A+ would do again" BS.
 
So,

I was going to buy a 2080 Ti at launch.

The whole raytracing thing is completely uninteresting to me. I was primarily interested in a 4k performance boost. The increases in performance over my Pascal Titan were not as large as I had hoped for, but I was willing to do it to get over the 60fps hump in all titles.

In the end, I delayed due to all the failures reported in early Founders Edition 2080 Tis.

Sure, they have a warranty, but I use a custom water loop. Getting the card integrated into my loop takes some work, and I didn't want to have to disassemble everything again and then potentially fight with them over coverage after the cooler had been removed, so I decided to hold off.

By the time I got a comfort level that they had their shit under control, it was too late for me.

My theory is that top-end GPUs and other hardware ONLY make sense when you buy them right at launch, because then you spread the value of owning them over a longer period before the next upgrade. Don't get me wrong, top-end hardware never really makes financial sense, but it makes more financial sense the longer you own it. Since I missed that window, I'm holding off until the next gen.

It never made any sense to me that GPUs cost just as much right before they go obsolete as they do when they first hit the market. As far as I'm concerned, hardware is a ticking clock, slowly becoming less and less valuable with every second that passes post-launch. If anything, new prices should decrease linearly the longer the GPU is on the market. Then I might be able to justify buying at any point beyond the first couple of weeks after launch.
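To put rough numbers on that amortization argument, here is a minimal back-of-the-envelope sketch. The price and refresh-cycle figures are made-up placeholders, not actual market data.

```python
# Rough cost-of-ownership-per-month sketch for the "buy at launch" argument.
# All prices and durations below are hypothetical examples.

def cost_per_month(price, months_owned):
    """Effective cost per month of ownership, ignoring resale value."""
    return price / months_owned

launch_price = 1200          # assumed flagship launch price
refresh_cycle_months = 24    # assumed gap until the next generation

# Buy at launch: the card is useful for the whole cycle.
at_launch = cost_per_month(launch_price, refresh_cycle_months)

# Buy a year late at the same price: same outlay, half the useful window.
a_year_late = cost_per_month(launch_price, refresh_cycle_months - 12)

print(f"Bought at launch:   ${at_launch:.0f}/month")    # $50/month
print(f"Bought a year late: ${a_year_late:.0f}/month")  # $100/month
```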
 
Also,

For those 6 of you who went from a 1080ti or Pascal Titan to an AMD card, may I ask why?

AMD's fastest still trades blows with these cards. At the very least, it wouldn't be enough of a performance increase to call it an "upgrade". :p
 
4K/120 will move the performance targets, eventually.


Heck, 4K/60 Hz even pushes the performance targets if you want high-quality settings. (And I don't see the point of playing any game without cranking up the settings.)

While many titles can hit that, the more demanding ones can't.

Deus Ex: Mankind Divided, Fallout 4, heck, even Metro 2033, which is REALLY old right now, can't consistently hit 60 fps on a 1080 Ti or Pascal Titan, regardless of how overclocked.

Then there are a bunch of titles which are right on the cusp. The Far Cry series, for instance: 4, Primal, 5, and New Dawn all hit 60-ish fps, but frequently drop below it, which is a no-no for me. I don't ever want to see a framerate below 60, even for a fraction of a second.
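For reference, here is the simple frame-time arithmetic behind those targets; nothing below is measured data, just the math of what a momentary dip under 60 fps means.

```python
# Frame-time budgets for the refresh targets discussed above.

for target_fps in (60, 120, 144):
    budget_ms = 1000 / target_fps
    print(f"{target_fps} fps -> {budget_ms:.2f} ms per frame")

# A single frame that takes 20 ms instead of ~16.7 ms is an instantaneous
# dip to 50 fps -- exactly the kind of momentary drop below 60 described
# above, even if the reported average stays above 60.
print(f"20 ms frame = {1000 / 20:.0f} fps instantaneous")
```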
 
Having used both, there really isn't much reason to upgrade/sidegrade away from a TXP card.

TXP https://www.3dmark.com/fs/20800558

2080ti https://www.3dmark.com/fs/21270633
I recall there being notable gains at 4k in many titles, in the 20-30 percent range, but it's been a while since I read the launch reviews.

I'm not necessarily a believer that canned benchmarks like 3Dmark really tell us anything of importance. They are more for displaying epeen.
That's a best-case scenario in a synthetic test, too. The 2080 Ti leaves a lot to be desired.


Link for full data, and individual title results

[Attached chart: 4K benchmark summary, RTX 2080 Ti vs. GTX 1080 Ti / Pascal Titan]


That's a 28.7% increase in 97th-percentile framerates, and a 33.03% increase in averages, over the 1080 Ti / Pascal Titan.

I'd imagine at lower resolutions, it winds up being more CPU bound and thus doesn't show much of any benefit.

The ~30% we are seeing is not an insignificant increase. It's just not a $1,200 increase. At least not for me, especially considering we are slowly approaching the discontinuation of the 2080 Ti.
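For anyone wondering how percentage figures like those above are derived, here is a minimal sketch; the framerates are normalized placeholders (1080 Ti indexed to 100), not the actual reviewed results.

```python
# How a percentage uplift like the ones quoted above is computed.
# The values are normalized placeholders, not the reviewed results.

def uplift(new_fps, old_fps):
    """Percent increase of new_fps over old_fps."""
    return (new_fps / old_fps - 1) * 100

old_avg, new_avg = 100.0, 133.0    # averages, 1080 Ti indexed to 100
old_p97, new_p97 = 100.0, 128.7    # 97th-percentile lows, same index

print(f"Average uplift:         {uplift(new_avg, old_avg):.1f}%")   # 33.0%
print(f"97th-percentile uplift: {uplift(new_p97, old_p97):.1f}%")   # 28.7%
```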
 
I built a new Threadripper system as a workstation, daily driver, and gaming PC last November. The specs were initially as follows: 2950X, X399 MSI MEG Creation, 64 GB DDR4, RTX 2080 Ti. I do really enjoy the build now, but god damn it has been an enormous pain in my ass to get to this point, and a big part of that pain was the RTX 2080 Ti. I have to give some setup and background to explain just how frustrating the 2080 Ti was for this system.

I preordered the 2080 Ti for this system and it was backordered. I finally received one of the early (Micron memory) units - by this time I had already started seeing reports about the "space invaders" GPU death. I plugged the card into my old system and tested for a few days - a big sigh of relief, everything worked fine (so I thought). I completed the new system, mounted the 2080 Ti in an EK block, and added it to the loop in my new build. The card ran extremely cool but couldn't overclock beyond 2000 MHz and couldn't handle any OC on the memory. I think I settled on 1950 MHz and stock memory, and just figured it was poor luck of the draw in the silicon lottery.

Within a month or two I started getting random DXGI DEVICE REMOVED errors when using the card at extended load for any period of time. I could replicate this with rendering, with F@H, with games - always a similar outcome: after a random amount of time at 100%, sometimes minutes, sometimes hours, I would get a crash with some variant of a DirectX error, device removed, etc.

At first I thought it was driver problems on a new card, then I figured I just REALLY lost the silicon lottery with this card and removed all OC, uninstalled the OC software, did fresh installs of Nvidia drivers, the works. This seemed to help, but the card still crashed intermittently, maybe once a week or so. I then thought it was related to the X399 motherboard, which had its share of memory troubles early on due to older AGESA.

Then, at the end of summer, I started getting the "space invaders" corruption and finally RMA'd the 2080 Ti :mad:.

I'm now running my old 1080 Ti again with a new waterblock, the system runs great, and I have a brand new 2080 Ti on my desk. I did test the new 2080 Ti in my old rig and it appears to be a newer revision with the Samsung memory, but I'm still thinking I may just drop it in the F/S forum and wait for the next gen, since the 1080 Ti runs like a champ under water.

TL;DR: Early 2080 Tis with Micron memory are ticking time bombs. Make sure you get a later Samsung memory variant or avoid them like the plague.
 
I've yet to run across anything that the 1080ti I bought used for $500 doesn't handle perfectly fine.

I'd argue that is probably true for absolutely everything at 1080p.

It's also probably true for most things at 1440p, unless you have a super-high-refresh monitor and demand that everything always sit maxed out.

It's once you get to 4K, however, that it starts showing its limitations.
 
...
My theory is that top-end GPUs and other hardware ONLY make sense when you buy them right at launch, because then you spread the value of owning them over a longer period before the next upgrade. Don't get me wrong, top-end hardware never really makes financial sense, but it makes more financial sense the longer you own it.
...
This doesn't apply if you wait for the better versions of cards to come out each time.
Then you will be using the card for exactly the same length of time.
The only downside is not having cards as early as possible, but then you miss the "teething" issues.

Higher-end cards make perfect sense if you want the best, or very close to the best, experience you can get when you can afford it.
It's by no means an expensive hobby, not compared to many others.
But there is a line most of us will draw at ridiculous price jumps; that's why I didn't bother with this gen.
 
Link for full data, and individual title results

[Attached chart: 4K benchmark summary, RTX 2080 Ti vs. GTX 1080 Ti / Pascal Titan]

That's a 28.7% increase in 97th-percentile framerates, and a 33.03% increase in averages, over the 1080 Ti / Pascal Titan.

I'd imagine at lower resolutions, it winds up being more CPU bound and thus doesn't show much of any benefit.

The ~30% we are seeing is not an insignificant increase. It's just not a $1,200 increase. At least not for me, especially considering we are slowly approaching the discontinuation of the 2080 Ti.

Right, so at best a 30% increase at 4K, which impacts a very small percentage of the gaming audience, even at $1,200. That's why I said it leaves a lot to be desired; ideally it should have been 50%+ for $1,200+. I bought mine for $800 new and I still think I overpaid.
 
The thread is about gaming cards, not productivity. You were just bragging, don't act like you weren't. The whole "A+ would do again" BS.

It's about cards that you use for gaming. And your gaming is a tiny niche within gaming. For twitch-based shooters, frame rate reigns supreme and high image quality is actually a drawback because it makes the user's brain take longer for target acquisition. Congrats on your weird flex.

Within the broader genre of FPS gaming, the 2080 Ti is a fairly significant step up in performance for all users who aren't limited by their 1080p60 screens.

Within the broad spectrum of gaming, the 2080 Ti is, again, a fairly significant step up in performance for all users who aren't limited by their 1080p60 screens. Oh, and they also need to be playing newer games because the older ones weren't GPU limited to begin with.

For raw compute usage, the 2080 Ti represented a significant step up from the Titan Xp. It essentially saves two full days every single week - and that's without any type of RTX acceleration. The Titan Xp itself was a big step up from the 1080 Ti for this usage in that it saved nearly a full day every single week.

So, like I said, A+ and I would do the upgrade that was asked about in the OP again even if strictly for my gaming.
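As a rough illustration of what those compute-time savings imply, here is a minimal sketch. It assumes a workload that would otherwise run around the clock, which the post above does not actually state, so the baseline is an assumption.

```python
# Implied throughput multiplier from "saves N days per week", assuming a
# compute job that would otherwise run 24/7. That baseline is an assumption,
# not something stated in the post above.

def implied_speedup(days_saved_per_week):
    return 7 / (7 - days_saved_per_week)

print(f"2 days saved/week -> {implied_speedup(2):.2f}x throughput")  # 1.40x
print(f"1 day saved/week  -> {implied_speedup(1):.2f}x throughput")  # 1.17x
```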
 
It's about cards that you use for gaming. And your gaming is a tiny niche within gaming. For twitch-based shooters, frame rate reigns supreme and high image quality is actually a drawback because it makes the user's brain take longer for target acquisition. Congrats on your weird flex.

Within the broader genre of FPS gaming, the 2080 Ti is a fairly significant step up in performance for all users who aren't limited by their 1080p60 screens.

Within the broad spectrum of gaming, the 2080 Ti is, again, a fairly significant step up in performance for all users who aren't limited by their 1080p60 screens. Oh, and they also need to be playing newer games because the older ones weren't GPU limited to begin with.

For raw compute usage, the 2080 Ti represented a significant step up from the Titan Xp. It essentially saves two full days every single week - and that's without any type of RTX acceleration. The Titan Xp itself was a big step up from the 1080 Ti for this usage in that it saved nearly a full day every single week.

So, like I said, A+ and I would do the upgrade that was asked about in the OP again even if strictly for my gaming.

Edit: never mind, no point in continuing this.
 
I built a new Threadripper system as a workstation, daily driver, and gaming PC last November. The specs were initially as follows: 2950X, X399 MSI MEG Creation, 64 GB DDR4, RTX 2080 Ti. I do really enjoy the build now, but god damn it has been an enormous pain in my ass to get to this point, and a big part of that pain was the RTX 2080 Ti. I have to give some setup and background to explain just how frustrating the 2080 Ti was for this system.

I preordered the 2080 Ti for this system and it was backordered. I finally received one of the early (Micron memory) units - by this time I had already started seeing reports about the "space invaders" GPU death. I plugged the card into my old system and tested for a few days - a big sigh of relief, everything worked fine (so I thought). I completed the new system, mounted the 2080 Ti in an EK block, and added it to the loop in my new build. The card ran extremely cool but couldn't overclock beyond 2000 MHz and couldn't handle any OC on the memory. I think I settled on 1950 MHz and stock memory, and just figured it was poor luck of the draw in the silicon lottery.

Within a month or two I started getting random DXGI DEVICE REMOVED errors when using the card at extended load for any period of time. I could replicate this with rendering, with F@H, with games - always a similar outcome: after a random amount of time at 100%, sometimes minutes, sometimes hours, I would get a crash with some variant of a DirectX error, device removed, etc.

At first I thought it was driver problems on a new card, then I figured I just REALLY lost the silicon lottery with this card and removed all OC, uninstalled the OC software, did fresh installs of Nvidia drivers, the works. This seemed to help, but the card still crashed intermittently, maybe once a week or so. I then thought it was related to the X399 motherboard, which had its share of memory troubles early on due to older AGESA.

Then, at the end of summer, I started getting the "space invaders" corruption and finally RMA'd the 2080 Ti :mad:.

I'm now running my old 1080 Ti again with a new waterblock, the system runs great, and I have a brand new 2080 Ti on my desk. I did test the new 2080 Ti in my old rig and it appears to be a newer revision with the Samsung memory, but I'm still thinking I may just drop it in the F/S forum and wait for the next gen, since the 1080 Ti runs like a champ under water.

TL;DR: Early 2080 Tis with Micron memory are ticking time bombs. Make sure you get a later Samsung memory variant or avoid them like the plague.


Yeah, I've read that the early Micron builds were a disaster waiting to happen. I'm waiting on my block to arrive in a few days, and it scares me to think I'd have to drain and disassemble this PC again if I get the space invaders minigame.
 
Too close to the RTX 3080 Ti release, and it seems like that will be the better upgrade. So I am sticking with my 1080 Tis for now.
 
1080 Ti since launch, waiting on whatever the next-gen RTX equivalent of the 1080 Ti is. Mostly for Cyberpunk; everything else runs fine.
 
At this point I cannot afford an RTX 2080 Ti. Anything else is a mini bump. So I'll wait.
 
I actually put my 1080 Ti under water recently, which might seem counterintuitive, but I couldn't justify the price for the 2080 Ti, as much as I would like to have one.
 
I actually put my 1080 Ti under water recently, which might seem counterintuitive, but I couldn't justify the price for the 2080 Ti, as much as I would like to have one.

I've run a 1080 Ti hybrid in some shape or form since launch.
I'm not chasing anything more than 165 Hz @ 1440p, so I'm good for a while.
I've always liked staying under 50 C at load.
I'm fine with lower-quality recordings because it's a hobby I don't bother pushing every day.

If I were making a living solely from streaming FPS content, then a 9900K + 2080 Ti + a 240 Hz 1080p panel would be a must.
I'd definitely use a discrete capture card in a second content-dedicated box.

There are hobbyist streamers who went to RTX cards just for the GPU encoder uplift on a single do-it-all build.
A couple of YouTubers with content-creation channels have made some money pushing the RTX recording and streaming uplift.

I don't know anyone besides big YouTubers who pushed ray tracing vids at product launch.
Good on them; worthless feature for most of us.
 
I've run a 1080 Ti hybrid in some shape or form since launch.
I'm not chasing anything more than 165 Hz @ 1440p, so I'm good for a while.
I've always liked staying under 50 C at load.
I'm fine with lower-quality recordings because it's a hobby I don't bother pushing every day.

If I were making a living solely from streaming FPS content, then a 9900K + 2080 Ti + a 240 Hz 1080p panel would be a must.
I'd definitely use a discrete capture card in a second content-dedicated box.

There are hobbyist streamers who went to RTX cards just for the GPU encoder uplift on a single do-it-all build.
A couple of YouTubers with content-creation channels have made some money pushing the RTX recording and streaming uplift.

I don't know anyone besides big YouTubers who pushed ray tracing vids at product launch.
Good on them; worthless feature for most of us.

Interesting take, I'd never really considered that, and thinking about it, it does make sense. They either pushed it or talked about it at length. I game at ultrawide 3440x1440 144 Hz, so for certain games that I really want to get immersed in, the 2080 would definitely help where the 1080 struggles just a bit completely maxed (FFXV).
 
I tend to skip a gen with gfx cards. My 1080 Ti rocks most games at max settings at 1440p. When I move to 4K, it'll be time, but that won't be till Ampere is on the scene.
 
I'm actually gaming at 4K/120 Hz and still using my 1080 Ti. I was close to upgrading to a 2080 Ti, but it's really not a worthwhile upgrade. Nvidia did an ass move with the 2xxx gen because they can, but I'm not going to support that shit. If next-gen AMD cards are 30% faster than the 1080 Ti and the 3080 Ti is 50% faster than the 1080 Ti, I will probably buy AMD.
 
Yes, I upgraded to a 2080ti. It was well worth it as my monitor is 4k, and the 1080ti just isn't enough for 4k.

Eh, if the 1080 Ti isn't enough for 4K, then the 2080 Ti isn't either, because the fps boost isn't big enough to matter. You're just trying to justify your meaningless $1300 purchase.
 
Eh, if the 1080 Ti isn't enough for 4K, then the 2080 Ti isn't either, because the fps boost isn't big enough to matter. You're just trying to justify your meaningless $1300 purchase.
But the difference is enough to be the difference between playable and a stutter-fest, or to save you from having to reduce your IQ settings.

Anyway it's his money.
 
Eh, if the 1080 Ti isn't enough for 4K, then the 2080 Ti isn't either, because the fps boost isn't big enough to matter. You're just trying to justify your meaningless $1300 purchase.
It’s the difference between playable and unplayable. It’s at least 30% faster.

Not sure where you got $1300 from either. It’s $1k or $1200 if you got founders.
 
Not sure where you got $1300 from either. It’s $1k or $1200 if you got founders.

Not really...

"Budget" versions start at $1050 + tax and the better ones that likely boost better due to binning are in the neighborhood of $1200 + tax. $1300 is pretty accurate.
 
If you really want to wait for a $999 blower version to come back in stock and then pay tax on it, more power to you.

IMO blower cards don't get enough respect. They fill a very important niche in workstations for users / companies that don't want to pay the Quadro tax. The current RTX FE coolers are decent and look fantastic but can't exhaust hot air from a poorly ventilated chassis (most workstations from HP, Lenovo, Dell, etc...). Not to mention a workstation setup where you need to run a few of these cards side by side, not gonna happen with the current FE cards.
 
Eh, if the 1080 Ti isn't enough for 4K, then the 2080 Ti isn't either, because the fps boost isn't big enough to matter. You're just trying to justify your meaningless $1300 purchase.
It’s the difference between playable and unplayable. It’s at least 30% faster.

Not sure where you got $1300 from either. It’s $1k or $1200 if you got founders.

I agree with Mchart on this one.

If we consider 60 fps the "playable" cutoff, having that extra 20-30 percent at 4K is the difference between playable and unplayable in many titles.

It isn't in all titles though, which is why I question the value of spending yet another $1,200 plus a water block on a 2080 Ti and still not being able to get playable framerates in everything.

I'm holding off for the next gen. 3080ti?
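As a quick illustration of that cutoff argument, here is a minimal sketch; the starting framerates and the ~28% uplift figure are assumed examples, not measured results.

```python
# Where a roughly 28% uplift actually crosses the 60 fps "playable" line.
# The base framerates below are hypothetical 4K results, not measurements.

THRESHOLD = 60
UPLIFT = 1.28  # assumed ~28% faster at 4K

for base_fps in (40, 48, 52, 58):
    boosted = base_fps * UPLIFT
    verdict = "playable" if boosted >= THRESHOLD else "still short"
    print(f"{base_fps} fps -> {boosted:.0f} fps ({verdict})")
```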
 
IMO blower cards don't get enough respect. They fill a very important niche in workstations for users / companies that don't want to pay the Quadro tax. The current RTX FE coolers are decent and look fantastic but can't exhaust hot air from a poorly ventilated chassis (most workstations from HP, Lenovo, Dell, etc...). Not to mention a workstation setup where you need to run a few of these cards side by side, not gonna happen with the current FE cards.

Agreed. I have a 5820k Alienware system with that triangle case design, and it's built for the GPU to be a blower design. Nothing wrong with it. It's just noisier.
 
I agree with Mchart on this one.

If we consider 60 fps the "playable" cutoff, having that extra 20-30 percent at 4K is the difference between playable and unplayable in many titles.

It isn't in all titles though, which is why I question the value of spending yet another $1,200 plus a water block on a 2080 Ti and still not being able to get playable framerates in everything.

I'm holding off for the next gen. 3080ti?

It won't get you 60+ in all titles, but it'll still get you 40-60 FPS, which is better than 20-40 FPS. Even more so if you have G-Sync, which makes things tolerable whenever you are in the 40-to-sub-60 range.

I agree though, the 2080 Ti is pretty much a waste if you aren't doing 4K gaming. I only upgraded to it because I went from a 1440p to a 4K monitor. The 1080 Ti was enough for 1440p, but not nearly enough for 4K.
 
Interesting take, I'd never really considered that, and thinking about it, it does make sense. They either pushed it or talked about it at length. I game at ultrawide 3440x1440 144 Hz, so for certain games that I really want to get immersed in, the 2080 would definitely help where the 1080 struggles just a bit completely maxed (FFXV).

I have an X34 that is used for everything else but FPS gaming. It was cool seeing all that real estate in games that support 3440x1440. My setup can push 100+ frames at that res. Playing RTS, Civ, etc. gives me much more command of what I'm doing.

It felt noticeably laggier than 1440p when trying to snipe, negotiate maps while flick shooting, or rapidly trying to reposition.

Going down to 1080p became problematic coming from 1440p because you don't realize how much extra mouse scanning you aren't doing. CoD MW Ground War is a perfect example of needing as much info as fast as possible to keep from dying constantly from all ranges and angles.
 
I don't quite fit this thread's niche BUT...

I'm still using a 980 Ti @ 1440 MHz and waiting for next gen. I play games at 1080p 144 Hz, or I play on my 4K TV with the resolution scale turned down to 60% or so.

The 980 Ti has JUST reached the end of its service life as a 1440p/60 Hz card with games like RDR2; I have to dial the settings back to maintain 60 FPS, so it's time. Most of my gaming time is just the weekend PUBG benders with my friend, and this card easily cranks out high enough FPS at 1080p in that game, so I'm not terribly desperate to upgrade. When I do, I'll also get a 1440p 144 Hz panel.

Honestly, I'm still tempted to just go with a used 1080 Ti, but I figure by summer-fall '20 the 1080 Ti will be a $250-300 used card and better alternatives will exist in the $400-600 market. I believe I paid around $549 for my 980 Ti, so the prices on new x80 Ti cards are just too much for me to handle. I can say I got my money's worth on the 980 Ti, though.
 
My X-Star 1440p is dual-link DVI, so I was not able to get it to work with the 1080 Ti (didn't want to spend money on an active adapter), so I have been running 1080p again, which kinda sucks. I have been waiting for a new monitor and GPU with the new DisplayPort revision to come out before I upgrade. I don't really game enough to justify an upgrade every new release, so I wait a few gens in between. Hopefully it comes soon, because I am getting tired of the small screen size/res. Maybe I will just get a Samsung TV like others have done to hold me over, then use it as a secondary display when the new monitors release.
 