Former or current 1080 ti/Titan (Pascal) owners

If you were the owner of a 1080 ti/titan did you buy a new 2080 ti/titan when released?

  • Yes, I bought a new 2080 ti/Titan.

    Votes: 21 15.1%
  • No, I am waiting for next gen/price drops.

    Votes: 108 77.7%
  • No, I bought an AMD card instead.

    Votes: 10 7.2%

  • Total voters
    139

Furious_Styles

Supreme [H]ardness
Joined
Jan 16, 2013
Messages
4,515
Was having a discussion with someone in another thread and we disagreed on whether [H] forum members were less likely to buy/own the new 2080 Ti/Titan than average, about as likely, or perhaps more likely.

Please chime in and vote if you were an owner of one of these cards!
 
The 1080 series is still doing fine, and I just can't accept the price of the 2080. I'm just not paying $700-$1,200 or more for a video card.
 
No reason to switch from a 1080 Ti to a 2080 Ti for a 20-30% increase. RTX as of right now isn't even relevant, so it's not a big deal.
When I do game it's at 1440p, and I also do VR. If I were still at 4K I still wouldn't, though it would make more sense.
 
My 1080 Ti is still doing fine at 1440p. If I were still running Eyefinity/Surround or had moved to 4K I might have been able to justify a 2080, but as is, it was just too much of a price hike.
 
I needed more power than a 1080 Ti for VR, so I upgraded, but I waited until I could at least get a fancy one with a water jacket.
 
Still on a 1080 Ti; the only gaming I do is CS:GO on a 1440p monitor at 155 Hz.
 
Yes, I upgraded to a 2080 Ti. It was well worth it, as my monitor is 4K and the 1080 Ti just isn't enough for 4K.
 
It was a worthy upgrade for me. Before, the 1080 Ti couldn't consistently hold 60 fps in The Witcher 3 at 4K even with HairWorks and AA off, but with the 2080 Ti I can turn on AA and HairWorks and I've never seen a dip below 60.
 
Still on an EVGA 1080 Ti as well. I usually skip a few generations between upgrades, especially with how much they want for the high-end cards.
 
It was a worthy upgrade for me. Before, the 1080 Ti couldn't consistently hold 60 fps in The Witcher 3 at 4K even with HairWorks and AA off, but with the 2080 Ti I can turn on AA and HairWorks and I've never seen a dip below 60.
It sounds surprising that the 1080 Ti couldn't hold 60+ fps in The Witcher 3 at 4K. Though I'm not sure why you'd run AA at 4K unless you're gaming on a very large screen.
 
It sounds surprising that the 1080 Ti couldn't hold 60+ fps in The Witcher 3 at 4K. Though I'm not sure why you'd run AA at 4K unless you're gaming on a very large screen.
And why would that sound surprising? A 1080 Ti can't keep 60 fps at 4K in most modern games, including plenty that look worse than The Witcher 3. And 4K doesn't mean you no longer need AA, even on a smaller screen. That topic has been discussed to death, and how bad the aliasing is really just varies from game to game.
 
It sounds surprising that the 1080 Ti couldn't hold 60+ fps in The Witcher 3 at 4K. Though I'm not sure why you'd run AA at 4K unless you're gaming on a very large screen.

The 1080 Ti can do about 70 fps average, but in a couple of places (Skellige after the Wild Hunt arrives, and the wilderness around Kaer Morhen) it drops to around 48-52 fps. The 2080 Ti averages around 90 fps and can do 70+ in those tough scenes.

I probably wouldn't care as much now that NVIDIA supports FreeSync monitors, but when all you have is vsync, the game looks brutally bad with the hitching in the low-50s fps range.
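For what it's worth, the uplift implied by those numbers is easy to check. A quick sketch (using the fps figures quoted above, not fresh benchmarks, and rounding the 48-52 fps lows to 50):

```python
# Generational uplift implied by the Witcher 3 4K figures quoted above.
pascal_avg, turing_avg = 70, 90   # fps: 1080 Ti vs 2080 Ti averages
pascal_low, turing_low = 50, 70   # fps: rough worst-case scenes (48-52 vs 70+)

avg_gain = (turing_avg / pascal_avg - 1) * 100
low_gain = (turing_low / pascal_low - 1) * 100
print(f"average uplift: {avg_gain:.0f}%")     # ~29%
print(f"worst-case uplift: {low_gain:.0f}%")  # ~40%
```

So the gap is biggest exactly where it matters for a vsync-locked 60 Hz screen: in the worst-case scenes.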
 
It sounds surprising that the 1080 Ti couldn't hold 60+ fps in The Witcher 3 at 4K. Though I'm not sure why you'd run AA at 4K unless you're gaming on a very large screen.
You still need AA regardless of pixel density due to things like shimmering. Pixel density only helps with the rough-edge problem.

The 2080 Ti is up to 40% faster than the 1080 Ti in games where the memory-bandwidth increase helps out more. It's the only card that will do 4K at acceptable frame rates.
 
You still need AA regardless of pixel density due to things like shimmering. Pixel density only helps with the rough-edge problem.

The 2080 Ti is up to 40% faster than the 1080 Ti in games where the memory-bandwidth increase helps out more. It's the only card that will do 4K at acceptable frame rates.
Which games show 40%? I have seen The Witcher, but it wasn't a whole lot faster. Any links?
 
Which games show 40%? I have seen The Witcher, but it wasn't a whole lot faster. Any links?

There are a lot, as long as the frame rates aren't in CPU-constrained territory.

https://www.tomshardware.com/reviews/nvidia-geforce-rtx-2080-ti-founders-edition,5805-3.html

The problem is most reviews had the 1080 Ti at 60-70 fps while the 2080 Ti would be up in the 100s, CPU-constrained. [H] had a really good 4K review, but that's gone now...

If you're looking at high Hz, it's more in the 25% range. If you're aiming for ~60-80 Hz, it's in the 35-45% range. Personally I aim for around 70-80 Hz, and I saw gains in the 40-50% range (custom cooled + BIOS/hard mods) over the 1080 Ti.
 
There are a lot, as long as the frame rates aren't in CPU-constrained territory.

https://www.tomshardware.com/reviews/nvidia-geforce-rtx-2080-ti-founders-edition,5805-3.html

The problem is most reviews had the 1080 Ti at 60-70 fps while the 2080 Ti would be up in the 100s, CPU-constrained. [H] had a really good 4K review, but that's gone now...

If you're looking at high Hz, it's more in the 25% range. If you're aiming for ~60-80 Hz, it's in the 35-45% range. Personally I aim for around 70-80 Hz, and I saw gains in the 40-50% range (custom cooled + BIOS/hard mods) over the 1080 Ti.
Yeah, it's hard to find a good benchmark. It seems they used the crappiest 1080 Ti against the best 2080 Ti, then posted the results. And some don't even list which cards they used.
 
Yeah, it's hard to find a good benchmark. It seems they used the crappiest 1080 Ti against the best 2080 Ti, then posted the results. And some don't even list which cards they used.

Yeah... and the 2080 Ti has a lot less overclocking headroom than the 1080 Ti, so maybe that 45% best-case scenario is more like 35-40%. It's not a terrible gain; it's the price that turned everyone off.

I don't regret buying one, though. I've had it almost two years, and I had a 10% coupon, so $1,080 minus the $500 for my 1080 Ti = $580 over two years. It's a cheap hobby. I finally have image quality I'm happy with, and it fits nicely with the Vive Pro for VR. I figure I'll skip Ampere and let Intel/AMD/NVIDIA fight it out for a while. If MCM is a thing in 2020, I want to skip the first generation of it.
 
I upgraded. I usually sell the previous gen to cut down on the upgrade cost. Works really well and I always have a super fast current gen card that way.
 
Yeah... and the 2080 Ti has a lot less overclocking headroom than the 1080 Ti, so maybe that 45% best-case scenario is more like 35-40%. It's not a terrible gain; it's the price that turned everyone off.

I don't regret buying one, though. I've had it almost two years, and I had a 10% coupon, so $1,080 minus the $500 for my 1080 Ti = $580 over two years. It's a cheap hobby. I finally have image quality I'm happy with, and it fits nicely with the Vive Pro for VR. I figure I'll skip Ampere and let Intel/AMD/NVIDIA fight it out for a while. If MCM is a thing in 2020, I want to skip the first generation of it.
Yeah, that's the point I'm at, really. I'd like to get a new VR headset sometime, but if my 1080 Ti can't handle it, then I'd do an upgrade. I do hope Intel/AMD bring out some awesome stuff. Maybe it will drive prices back down to normal.
 
I bought my 1080Ti used a year ago and I plan to keep it a while longer since it does fine for the games I play at 4K. I would love having extra performance since I have a high refresh rate display but I can't justify the cost of a 2080Ti. I'm going to be watching the launch of Ampere and big Navi very closely though.
 
Yeah... and the 2080 Ti has a lot less overclocking headroom than the 1080 Ti, so maybe that 45% best-case scenario is more like 35-40%. It's not a terrible gain; it's the price that turned everyone off.

I don't regret buying one, though. I've had it almost two years, and I had a 10% coupon, so $1,080 minus the $500 for my 1080 Ti = $580 over two years. It's a cheap hobby. I finally have image quality I'm happy with, and it fits nicely with the Vive Pro for VR. I figure I'll skip Ampere and let Intel/AMD/NVIDIA fight it out for a while. If MCM is a thing in 2020, I want to skip the first generation of it.

Turing was released a little over a year ago. It might seem like two, but it hasn't been that long yet.

Anyway, I can definitively say Turing, or specifically the 2080 Ti, is not worth it if you have a 1080 Ti/Titan X Pascal. The 2080 Ti I have is faster than my Titan X by every metric, but I can't objectively feel the difference while gaming. So save your money, wait until next-generation consoles are in full swing, and buy the second iteration of Ampere or whatever.
 
Turing was released a little over a year ago. It might seem like two, but it hasn't been that long yet.

Anyway, I can definitively say Turing, or specifically the 2080 Ti, is not worth it if you have a 1080 Ti/Titan X Pascal. The 2080 Ti I have is faster than my Titan X by every metric, but I can't objectively feel the difference while gaming. So save your money, wait until next-generation consoles are in full swing, and buy the second iteration of Ampere or whatever.

lol, it was a long year! I personally can feel the difference. There's a certain level of image quality that stops the shimmering in my favorite game (World of Warships) but drops me below 60 Hz on a non-VRR screen with a 1080 Ti. So for me it's worth it.
 
Turing was released a little over a year ago. It might seem like two, but it hasn't been that long yet.

Anyway, I can definitively say Turing, or specifically the 2080 Ti, is not worth it if you have a 1080 Ti/Titan X Pascal. The 2080 Ti I have is faster than my Titan X by every metric, but I can't objectively feel the difference while gaming. So save your money, wait until next-generation consoles are in full swing, and buy the second iteration of Ampere or whatever.
Yeah, no. 30-40% is noticeable at 4K when it means the difference between staying above 30 fps or not.

It also gets you up to 60+ fps in the games where you'd otherwise be hovering around the mid-40s.

If you can’t notice that, good for you. But I could, easily.
 
lol, it was a long year! I personally can feel the difference. There's a certain level of image quality that stops the shimmering in my favorite game (World of Warships) but drops me below 60 Hz on a non-VRR screen with a 1080 Ti. So for me it's worth it.

I suppose at 4K that may be true. I game at 1080p/240 Hz and mainly play shooters, so for me consistently high fps is paramount, and both my overclocked Titan X and my 2080 Ti hit my fps cap (200) in Apex Legends. I tried The Witcher 3, and the 2080 Ti is faster but didn't feel any smoother. And yeah, it feels like Turing has been around a long time.
 
I kept my 1080 Ti, and I plan to upgrade when the next gen comes out. That way I can give my 1080 Ti to my wife! Win/win for the both of us!
 
Yeah, no. 30-40% is noticeable at 4K when it means the difference between staying above 30 fps or not.

It also gets you up to 60+ fps in the games where you'd otherwise be hovering around the mid-40s.

If you can’t notice that, good for you. But I could, easily.

I bet even most people buying a 2080 Ti don't use it for 4K gaming, since the majority of gamers are at 1080p. Even if you make the argument that the 2080 Ti targets 4K, that would still only be a fraction of 4K owners. So I concede it would probably make some difference at 4K if you insist on all-high settings, but at 1440p and below you will be hard-pressed to notice the difference.
 
I'll keep my 1080 Ti at 1440p; it fills all my needs. If I decide to go 4K, I'll revisit a 2080 Ti or maybe a 2070 Super, depending on cost at that time.

Edit: or I may go with an AMD GPU if I go with an AMD X570 mobo. Depends on whether AMD comes out with a card to match the 2080 Super/Ti. I'd really like to go with one manufacturer for GPU & CPU.
 
I bet even most people buying a 2080 Ti don't use it for 4K gaming, since the majority of gamers are at 1080p. Even if you make the argument that the 2080 Ti targets 4K, that would still only be a fraction of 4K owners. So I concede it would probably make some difference at 4K if you insist on all-high settings, but at 1440p and below you will be hard-pressed to notice the difference.

I think we are definitely in diminishing-returns territory. I always figured that's one reason NVIDIA pushed ray tracing over a straight rasterization card. I think most people are at the point where upgrading isn't necessary without some jazzy new feature, and NVIDIA probably realized the same.
 
I think we are definitely in diminishing-returns territory. I always figured that's one reason NVIDIA pushed ray tracing over a straight rasterization card. I think most people are at the point where upgrading isn't necessary without some jazzy new feature, and NVIDIA probably realized the same.

4K/120 will move the performance targets, eventually.
 
Had three watercooled 1080 Tis during the mining craze. Mining paid them off, and when that went away I sold one, so I'm now down to two in SLI. I see no reason to upgrade until HDMI 2.1 cards come out, along with the monitors to go with them.
 
Yeah, my answer is a total non sequitur to the question in the OP.

Perhaps your definitive claim of "not worth it" is hardly as definitive as you think.

If you went from a 2080 Ti to a Titan RTX for gaming, then you either won the lottery or just enjoy burning money. That doesn't mean my point doesn't stand: 99.9% of people don't do insane "upgrades" like that unless it's for productivity reasons.
 
I went from a Titan X(p) to a 2080 Ti. I did it since I play at 4K, and overall I'm satisfied with the performance; I've been able to play everything at 4K/60 fps in the 15 months since I got it. I'll probably pick up a 2180 Ti if it's priced right at launch; if not, I might wait to see what AMD and Intel offer.
 
I upgraded to a 2080 Ti from a Titan X the day it came out. The performance boost at 4K was well worth it. Ray tracing is just an added bonus.
Yeah... and the 2080 Ti has a lot less overclocking headroom than the 1080 Ti, so maybe that 45% best-case scenario is more like 35-40%. It's not a terrible gain; it's the price that turned everyone off.

I don't regret buying one, though. I've had it almost two years, and I had a 10% coupon, so $1,080 minus the $500 for my 1080 Ti = $580 over two years. It's a cheap hobby. I finally have image quality I'm happy with, and it fits nicely with the Vive Pro for VR. I figure I'll skip Ampere and let Intel/AMD/NVIDIA fight it out for a while. If MCM is a thing in 2020, I want to skip the first generation of it.
Overclocking headroom on the 1080 Ti and 2080 Ti is about the same: both can go from 1.5 GHz to 2.1 GHz, a 40% increase. That makes sense, since both are on essentially the same process (12 nm FFN is a tweaked 16 nm). My Titan X could actually only hold about 1.9 GHz, while my 2080 Ti stays close to 2.1 GHz most of the time. Both have/had AIO coolers attached.
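A quick check of that 40% figure, taking the 1.5 GHz and 2.1 GHz clocks quoted above at face value:

```python
# Headroom from stock-ish clock to sustained overclock, in percent.
base_clock, oc_clock = 1.5, 2.1  # GHz
headroom = (oc_clock / base_clock - 1) * 100
print(f"overclocking headroom: {headroom:.0f}%")  # 2.1 / 1.5 = 1.4, i.e. 40%
```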
 
I kept my 1080 Ti, and I plan to upgrade when the next gen comes out. That way I can give my 1080 Ti to my wife! Win/win for the both of us!

I built a rig for my nephews and put my Titan Xp in that, so I bought a 2080Ti for myself.

I wasn't going to upgrade my video card; I built a new X299 rig and planned on using the Xp. Since I hadn't planned to build them another rig, I already had my loop with the Xp/EK block built. I pulled it out and put in a 2080 Ti with a Heatkiller block, so I had to drain my loop and bend a couple of new tubes. My Xp was a stout card; I shunt-modded the power limits off of it and it was a killer. My 2080 Ti is a vanilla Founders Edition and isn't much of an overclocker, but it seems to handle what I toss its way. I seriously doubt I'll be replacing the 2080 Ti with anything for a while.
 
If you went from a 2080 Ti to a Titan RTX for gaming, then you either won the lottery or just enjoy burning money. That doesn't mean my point doesn't stand: 99.9% of people don't do insane "upgrades" like that unless it's for productivity reasons.

Are you more comfortable with the location of the goalposts now that you have moved them somewhere new? The OP does not mention gaming. Your "definitive" answer talks about a tiny niche within FPS gaming, which is itself a niche within the gaming submarket. For your 0.001% niche, the 1080 Ti isn't worth it either, because improved image fidelity is counterproductive. For everybody else, the 2080 Ti is a massive improvement.
 