Opinion? RTX 2080 Ti or stay with the 1080 Ti?

tangoseal

[H]F Junkie
Joined
Dec 18, 2010
Messages
9,743
I've read the reviews... can't make my mind up.

Do you guys think I should just stick with the 1080 Ti under water right now, or get an RTX 2080 Ti?

I play @ 240Hz 1080p and 100Hz 3440x1440, both G-Sync.

I also do a LOT of video transcoding/rendering work, mostly on the CPU, but sometimes I use my GPU for really, really long and big jobs.

I wonder if the video engine on the RTX cards is bigger and faster than on the GTX cards?

Anyways, what do you think? Worth it or no?
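For what it's worth, a back-of-the-envelope comparison (just arithmetic, not benchmark data) shows the two display modes above push a nearly identical number of raw pixels per second:

```python
# Raw pixel throughput for the two display modes mentioned above.
def pixels_per_second(width, height, refresh_hz):
    return width * height * refresh_hz

fhd_240 = pixels_per_second(1920, 1080, 240)    # 1080p @ 240 Hz
uwqhd_100 = pixels_per_second(3440, 1440, 100)  # 3440x1440 @ 100 Hz

print(f"1080p @ 240 Hz:     {fhd_240:,} px/s")
print(f"3440x1440 @ 100 Hz: {uwqhd_100:,} px/s")
print(f"ratio: {fhd_240 / uwqhd_100:.3f}")
```

Raw throughput isn't the whole story (CPU limits bite much harder at 1080p/240), but it suggests neither mode is trivially "easier" for the GPU than the other.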
 
I'd go no for now. I've done the switch and haven't been impressed enough to even update my sig. I don't think you'd see a marked improvement at 1080p; I'm not noticing a marked improvement at 1440p either, despite the preview here at the [H] showing some improvement. (I think against an OC'd 1080 Ti, the RTX 2080 Ti's gap is even smaller.)
 
If you were upgrading from something like a 570 or a 980 I'd say yes, but there's just not a big enough increase in performance from a 1080 Ti for the price, IMO. I'm having this debate right now (currently on an OC'd 970 FTW), and it's STILL a tough pill to swallow.
 
There would be almost no gain for you at 1080p, as it will just bottleneck. You would see some gain at your other resolution, but I don't think it would be a big enough gain for you to really care, given the price you'll pay.
 
Many thanks... your opinions helped me decide. I am not going to go RTX. I will wait until AMD or Nvidia releases the next big thing... whatever. The 1080 Ti is already very fast and I am very happy with the card, to be honest.
 
Many thanks... your opinions helped me decide. I am not going to go RTX. I will wait until AMD or Nvidia releases the next big thing... whatever. The 1080 Ti is already very fast and I am very happy with the card, to be honest.


I think that's a sound decision.
 
I would try to find data on your specific applications. In certain games/circumstances the 2080 Ti can be ~45% faster. I've also read it's significantly better at some professional workloads.

I do agree with the 1080p comments, however.
 
Referencing Slade's sig above, with a 2950X you're likely to be further CPU-bound at 1080p with lower per-core performance, assuming that you could use any more performance at 1080p in the first place. That's intensely application-specific.

At 3440x1440 there's not nearly as much CPU limitation, so a 2080 Ti could potentially let you run higher settings. I'll say that my 1080 Ti with a hybrid cooler (>2000MHz core sustained under load) isn't fast enough at 2560x1440 for everything at higher/highest settings, and benchmarks bear that out, but I'd have a hard time making the 'value' argument for it for myself. I'd have a hard time making that argument for someone with a 1080 Ti even if 2080 Tis sold for the same price, much less significantly more.

So generally speaking, and agreeing with the above, not recommended right now.
 
I think you should buy two 2080 Ti's because if you buy more, you save more.

I had SLI 1080 Tis but sold one during the hypapalooza mining insanity.

It was impressive in some but not the bulk of titles.

What interests me more about a 2080 Ti is the video engine segment of the GPU silicon. I do a lot of H.265 work, and even on my 2950X it takes a while. CUDA hauls ass.

IdiotInCharge... I play 1080p because I have a 240Hz monitor. I also play @ 3440x1440.

The 2950X surprisingly isn't as weak as many, maybe you, believe. It's actually a very strong contender, not far behind Intel's high-end HEDT in single-core performance. The 9900K, though, is a different story.
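On the H.265 point above: ffmpeg exposes the NVENC HEVC encoder as `hevc_nvenc`, which offloads encoding to the GPU's video engine. A minimal sketch of building such a command (filenames and the quality target are made up for illustration, not from the thread):

```python
import shlex

# Hypothetical input/output names; -cq 23 is an arbitrary quality target.
cmd = [
    "ffmpeg", "-i", "input.mov",
    "-c:v", "hevc_nvenc",   # NVENC HEVC (H.265) encoder
    "-preset", "slow",      # slower preset = better quality per bit
    "-cq", "23",            # constant-quality target
    "-c:a", "copy",         # pass audio through untouched
    "output.mp4",
]
print(" ".join(shlex.quote(c) for c in cmd))
```

Worth noting that NVENC output is produced by fixed-function hardware, so quality per bitrate differs from CPU x265; for "really really long" jobs the speedup can still be worth it.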
 
I came to the same conclusion. It’s not worth the price for the performance you get. And I’m not buying into this half baked ray tracing bullshit.
 
I came to the same conclusion. It’s not worth the price for the performance you get. And I’m not buying into this half baked ray tracing bullshit.

It's just early days for the tech, without enough oomph, and Nvidia needed to make some money off it at some point. It's just too bad that it was coupled with mediocre performance gains and an insane price increase.
 
Did everyone miss that the OP is running 1080p @ 240Hz? Even at 1080p, high refresh can be demanding on the GPU.
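To put numbers on why high refresh is demanding: the per-frame time budget shrinks in inverse proportion to the refresh rate. A quick sketch:

```python
def frame_budget_ms(refresh_hz):
    # Milliseconds available to render each frame to sustain the refresh rate.
    return 1000.0 / refresh_hz

for hz in (60, 100, 144, 240):
    print(f"{hz:>3} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
```

At 240 Hz the whole pipeline (CPU and GPU) has roughly 4.2 ms per frame, versus ~16.7 ms at 60 Hz, which is why even 1080p can keep a big GPU busy.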
 
Did everyone miss that the OP is running 1080p @ 240Hz? Even at 1080p, high refresh can be demanding on the GPU.

'Cause this is why:

[benchmark screenshot]
 
Good lawdy man... there's NO point in getting an RTX card over my 1080 Ti. I do not own nor want 4K right now, not until it matures even more and is more affordable.

I'm definitely staying with the 1080 Ti... wtf was Nvidia thinking?

They are banking on ray tracing, really.

Well, it's already working in the professional market, but they are making a big push in the consumer market too. It's a win/win for them if they pull it off with the same die. I personally am not upgrading until I see RT first, and how these cards perform with it; until then I can game at 1080/1440 just fine on my 1080 too.
 
On the fence myself; I have a Titan X (2016 version). It really all depends on Battlefield V performance for me. The beta and alpha ran OK, but under 60fps most of the time in 4K, though there was apparently some AA bug which impacted performance as well. So I'm holding out to see how the final game runs.

I've actually taken delivery of an MSI Duke and sent it back after mulling it over (unopened, obviously), and also refused delivery of an EVGA XC Ti due to changing my mind in the time it took to arrive. :oops:

The performance bump (around 20ish fps in 4K) vs. the price is the thing that really bugs me; it just seems like I'd be wanking away money for performance I could gain by dropping some detail settings.
 
The performance bump (around 20ish fps in 4K) vs. the price is the thing that really bugs me; it just seems like I'd be wanking away money for performance I could gain by dropping some detail settings.

This is true almost all of the time elsewhere too :)
 
Even if you're at 1080p and CPU-bottlenecked, you can use more GPU by turning on DSR (or VSR for AMD), which makes 1080p look way better. That could be useful even on your 1080 Ti.

I don’t think you ever actually told us your specific use cases.
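To make the DSR suggestion concrete: DSR factors in NVIDIA's control panel are area multipliers, so the game renders at a proportionally higher internal resolution and the image is filtered back down to native. A quick sketch of what two common factors mean in raw pixels (the factors are from NVIDIA's control panel; the math is just area scaling):

```python
# DSR factors multiply the native pixel count; the game renders at the
# higher resolution, then the image is downscaled to native 1080p.
NATIVE = (1920, 1080)

def dsr_resolution(native, factor):
    scale = factor ** 0.5  # factor applies to area, so edge scale is its sqrt
    return (round(native[0] * scale), round(native[1] * scale))

for factor in (2.25, 4.00):
    w, h = dsr_resolution(NATIVE, factor)
    print(f"{factor}x DSR on 1080p renders {w}x{h} ({w * h:,} px)")
```

So 4.00x DSR makes a 1080p monitor cost as much GPU as native 4K, which is one way to soak up spare GPU headroom when the CPU is the limit.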
 
This is true almost all of the time elsewhere too :)


It is, but this time around the cards are a lot more expensive than previous Ti models, so it's more of an investment for an "OK" fps bump. It wouldn't be quite as bad if the cards were a couple of hundred cheaper, but that's a while off.
 
It is, but this time around the cards are a lot more expensive than previous Ti models, so it's more of an investment for an "OK" fps bump. It wouldn't be quite as bad if the cards were a couple of hundred cheaper, but that's a while off.

Don't forget the "three slot" bit.

I mean, yeah, I get that it's powerful, but I thought two slots was pushing it when the FX 5800 introduced its cooler to the market at large, especially when the 9700/9800 Pro were single-slot with equivalent performance.

Three is insane. It just looks weird.

I get that we're in an era where maybe we don't have as many peripherals as we used to, but the extra slot makes all the difference in the world if you want to add more than a sound card plus a misc PCIe device (i.e., two-ish PCIe cards), if you ask me.
 
It is, but this time around the cards are a lot more expensive than previous Ti models, so it's more of an investment for an "OK" fps bump. It wouldn't be quite as bad if the cards were a couple of hundred cheaper, but that's a while off.

I've regularly advocated that the 2080 Ti doesn't make sense over a 2080 or 1080 Ti unless the performance of those cards simply isn't "enough," in which case the bump in performance may be worth it. So I'm not disagreeing here: it's a lot of money, up where the more special, expensive-to-build parts have lived, but at the same time it's not priced like a Quadro; it's moderately accessible. Which, for top-end performance, doesn't make it necessarily unreasonable.
 
Three is insane. It just looks weird.

Why?

Sure, if you have stuff to put in those slots it can be inconvenient, but another slot to keep boost clocks under load while keeping the noise down? I don't see it as a bad compromise.
 
Why?

Sure, if you have stuff to put in those slots it can be inconvenient, but another slot to keep boost clocks under load while keeping the noise down? I don't see it as a bad compromise.

For sure, man.

It's just the traditionalist in me; when I was younger, it seemed everyone who was 5-10 years my senior was all about traditionalism.

I have been "out of the game" in terms of goings-on in the industry. I only learned in the past month that SLI is not what it used to be, so maybe 2080 Ti x 2 is not beneficial, and thus a single card taking three slots isn't so bad.

I am just hoping this doesn't normalise three-slot cooling in their typical premium segment.
  • The "premium segment" I'm referring to is more applicable to the time when, say, the 980/1080s came out. I don't want that price point [not the 2080 Ti/2080 price point] having three-slot solutions.
Just my opinion, of course.
 
I am just hoping this doesn't normalise three-slot cooling in their typical premium segment.

Well, it's a compromise: you still have dual-slot open-air coolers and closed blowers, but perhaps the better alternatives are the hybrid solutions that use a low-speed blower and a closed-loop cooler to quietly and efficiently keep the card cool at boost speeds.
 
As stated, I've done the switch to a 2080 Ti from a 1080 Ti, and in gaming it isn't even noticeable at 1440p; maybe at 3440x1440 it might edge out a 1080 Ti, but not by much. My curiosity about the 2080 Ti is the tensor cores, which ties back to my work on machine learning: basically getting applications I have working on Volta working on RTX.
 