I just bought a 1080ti, I'm going to hold out till 2019 for my next GPU, who's with me??

Which has fuckall to do with the discussion at hand because...?

In this thread I was replying to another member, not you, before you jumped in off topic, and now you put the blame on me!
 
None of that is relevant at all other than it took a few additional years for AMD to make their big moves in the GPU market (hence my earlier post).

It seems as though you were debating IdiotInCharge's post about "AMD still hasn't caught up," and for reasons unknown you chose to use the companies' founding years as some sort of metric in your rebuttal.

Anything AMD had prior to the ATI acquisition in 2006 is pretty much meaningless in the argument of AMD vs. Nvidia performance or who is ahead.

If you think having some sort of seniority is valid, then in this case it would seem that Nvidia is the senior.

You have your opinion and I have mine; sometimes threads go off topic, so what!
 
"At hand" would mean very close or in this case, the topic at this very moment.

Meaning, what does your post about AMD's quad core and the Nvidia fallout deal have to do with the discussion going on at this very moment (pretty much between IdiotInCharge, you, and me)?

You really can't admit how much of your argument is invalid.

If a company like Ford Motors decides to acquire AMD in 2006, then there is no way that Ford can be behind Intel. Ford was founded in 1903 and Intel wasn't founded until 1968.

If you are going to try to cherry-pick little details for a rebuttal, it needs to have some sort of remote value in the argument you are making.

Trying to use AMD's quad core and the fallout deal with Nvidia seems like you are trying to play the sympathy card for AMD.

Again, that holds no value at all in the argument of "which was the first 4K gaming video card" or "has AMD caught up to Nvidia or not."
 
When did you become a Staff Member!
 
You don't need to be a staff member to have an intelligent debate.

Though trying to string together a simple one with you is proving to be a bit difficult.
 
Folks, this is why you shouldn't drink and post.

But it's my birthday and it's football Sunday. Gano clutched and broke his record with a game-winning field goal.

Don't I get a freebie? ;)
 
This is not the thread to have this debate in, and I have now gotten bored with it. Let's get back to what I asked another member before IdiotInCharge started this argument off. :)
 
If you want to avoid side debates then don't bring in meaningless details in the future.

As an unofficial staff member that has been ordained by you, I'll let you off the hook now.
 
Please don't tell me how to write any debate in the future; you have your opinion, I have mine. ;)
 
Debates where delusional opinions > actual facts.

Yea I see there is no point in continuing this further.

Perhaps you also believe that going out and teaching a pet hamster to make sushi will revive the almighty Chaintech. It is totally relevant because of skittles.
 
I will say, the 1080 isn't enough for 4K for the money, and I'm not a FPS whore or anything, or a serious gamer even by any reasonable standard. It was an upgrade from a 1060 and it's "okay", but not enough performance at 4K for the money. What I am going to do, instead of buying a Ti or a 2080, is move down to 1440p for half the money, where I think a plain 1080 will be plenty. Maybe that'll be the new trend, lower resolution instead of spending $700+ on a stronger GPU lol...

(yes I knew the 1080 was marginal at 4K, it was basically that or nothing though)
 
After debating what to do for the last couple weeks I finally picked up a used 1080 Ti. It's certainly not the performance jump I had looked forward to coming from my 1080, but it's the only thing that remotely makes sense right now, if you want to call it that. The RTX cards offer pathetic generational performance increases, period, and even more so when you consider the absurd pricing. I'll just run this 1080 Ti until 7 nm GPUs hit.

Certainly not the outcome I had looked forward to all summer, but it will be a decent jump in performance. The reality is that it's just an awful time to be in the market for a new GPU.
 
And what exactly was the jump in performance you were expecting? We are talking numbers above the 40% mark, up to 50% in some cases, going from the 1080 Ti to the 2080 Ti. That's a pretty large jump in performance if you ask me.
 
780 Ti -> 980 Ti = 40% to 60% improvement, $50 decrease in launch price
980 Ti -> 1080 Ti = 70%+ improvement, $50 increase in launch price
1080 Ti -> 2080 Ti = ~35% improvement, $500 increase in launch price

To answer your question, I would expect a 100%+ performance increase for the prices they are asking and the way they are marketing this GPU as revolutionary, like the 8800 GTX. For the performance it truly offers, I would expect the $650 price tag like Maxwell. Maxwell itself was very lackluster in terms of generational performance gain compared to what we typically see from Fermi to Kepler, Kepler to Maxwell, and Maxwell to Pascal. Turing can't even match Maxwell. It's beyond pathetic for the pricing.

I'm not butt hurt about it. It is what it is. The only thing I can do is vote with my wallet. I could certainly afford a pair of 2080 Ti's if I wanted them. However, no matter how much money I have at my disposal, I'm not going to just be blatantly ripped off like that.
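For what it's worth, the perf-per-dollar swing in those three jumps is easy to sketch. Performance multipliers below are midpoints of the ranges quoted, and the launch MSRPs ($699/$649/$699/$1199) are from memory, so treat the exact figures as rough:

```python
# Rough perf-per-dollar sketch for the generational jumps above.
gens = [
    ("780 Ti -> 980 Ti",   1.50, 699, 649),   # ~50% faster, $50 cheaper
    ("980 Ti -> 1080 Ti",  1.70, 649, 699),   # ~70% faster, $50 pricier
    ("1080 Ti -> 2080 Ti", 1.35, 699, 1199),  # ~35% faster, $500 pricier
]

for name, perf, old_price, new_price in gens:
    # relative performance-per-dollar vs. the previous flagship
    value = perf * old_price / new_price
    print(f"{name}: {100 * (value - 1):+.0f}% perf/dollar")
```

That works out to roughly +62%, +58%, and -21% perf per dollar: the first Ti generation in this list where value actually goes backwards.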
 
This is what happens when competition dwindles. I went 1080ti just recently as well. When AMD dropped the RX 480, nvidia came back to trounce it. Without an AMD card to hold the performance in check, we get slower improvements. It'd be interesting to chart this over time.
 
The lack of competition is certainly the main driver of this. The sad thing is we are talking about the Kepler to Pascal periods, not exactly the golden days of GPU competition themselves.
 
I only care about ray tracing.

1080ti -> 2080ti = 600% for 70% more money.

Or is it 0%... ;)

I guess time will tell.
 
Hmmm, I would lean more towards the 0%, since there isn't a single game out there that supports it. When there is, performance will be so bad it's not worth talking about anyway.
 
We only have one prophet and that’s KazeoHin.

We’ve already seen 4k games at high fps and RT.
 
Which games are those? I'm not doubting you, I just haven't seen them yet I guess.

From everything I've read, the 2080 Ti still isn't a viable single GPU solution to 4k 144 without RT.

If you are interested in Ray Tracing, that's great and these cards are for you. From my understanding 1080P 60 FPS is the target for the 2080 Ti and Ray Tracing. I'm personally not interested in 1080P or 60 FPS by themselves, let alone the two combined. For me, Ray Tracing right now has about as much relevance as quantum computing.

By the time there are enough titles that support it, we should have 7 nm GPUs with much more viable performance for it. The only game I've heard of that's supporting Ray Tracing any time soon that interests me is Battlefield V. No way I'm going back to sub 144 FPS for an online first person shooter.
 
I can’t remember the name of it. Factum! Where’s Factum!?

I cannot blame anyone for being pessimistic.

The only reason I really care about the 2080 Ti is powering VR; DLSS 2X working would be a bonus. Same with Ray Tracing: it'd be nice, but not part of my decision.

I think there’s a decent possibility DICE will surprise us with RT. About a month ago they said 50-65 FPS at 1440p was doable. And they talked about splitting the rasterized resolution and the RT resolution.
 
I am still using my 1080.
Nvidia knows they can do whatever they want.
Would love to see AMD do something, but I am not holding my breath.
 
I'm really not trying to be pessimistic. I'm just calling it like I see it with the information that's available to us today. I would love nothing more than Ray Tracing to have great performance and software adoption and have a real reason to spend $1200 on this card.

The fact is, Ray Tracing in the wild is not available today. What little information we've seen about its expected performance showed SOTR running from 30-70 FPS at 1080P. Granted, that was an early game build on early drivers with a GPU that had yet to be released at the time. Maybe performance will be much better when it's actually launched.

My personal preferences just lean towards at least 1440P and 144 hz, and unless something changes drastically, it just doesn't appear the 2080 Ti is going to be able to offer that with RT. I had high hopes that the 2080 Ti could push 4K 144 with a single GPU (no RT), but by everything I've read, it cannot. So for me, the cards just serve no practical purpose unless something really unforeseen happens. If it does, my mind will change and I'll buy one for sure. I just have a suspicion we'll have proper Turing GPUs on 7 nm before that happens.
 
There’s plenty of videos showing RT in a better light. That one was from just a few days after developers first touched an RTX card. Watch the Digital Foundry DICE interview, though even that is very old now. But I’ve wasted enough time on this topic. Like I said, time will tell.
 
I picked up a used MSI Gaming X 1080 Ti off eBay a few weeks ago for $550 including shipping. Plus I have a 43in 4K monitor coming this week. Gonna sit on this setup for a while.
 
I'm thinking of waiting until AMD has a worthy GPU to replace my 1080 Ti (which I bought at launch); I'm kinda sick of the way Nvidia is treating their customer base.
In any case I'm skipping the 20x0 generation for the foreseeable future.
While I'd love to upgrade to 4k I can't/don't want to spend the money on a nice monitor with gsync + a 2080ti.
 
...by releasing innovative, market-leading products?

They make some amazing products for sure and they do lead the market. Can't argue with that.
Sick might have been too strong a word. But I do plan to vote with my wallet. Even if that's not gonna matter at all to them.
 
With respect to the pricing, I agree- it's mostly that performance/price doesn't make sense coming from a 1080Ti until you try to push 4k/VR. Had they dropped pricing on their whole lineup and introduced the 2080Ti at 1080Ti prices, maybe...

As for the other crap, well, I vote with my wallet too, and I vote for innovation :).

AMD has innovated and absolutely can innovate today, but until they do, their solutions are of limited appeal at the high-end. Midrange I'm fine with, got an RX560 to drive additional displays right next to my 1080Ti!
 
I thought the value was better than expected on the 1080 Ti, so I got one. Initially I was gonna pass on it but changed my mind once it was released.
Right now, even if I had an older GPU I doubt I'd be interested at those prices.
Where things will get interesting is when the 3080 Ti is released in ~2 years. By that point I'm sure I'll be a whole lot more interested in Nvidia. Assuming AMD will not have caught up in the high-end segment.
Anyone with a 1080ti can wait it out imo.
 
I put my most recent build together a couple of years ago with a 4K monitor. The 4K monitor has been awesome for general computer use; however, when I built the computer there were no 4K-capable cards available, so I picked up a GTX 960 and waited. The 1080 Ti was 4K capable, but the cryptomining boom sent prices through the roof. Now we have a new generation of cards, and prices are even higher.

I think I'm going to break down and hand NVidia some cash for a 2080 or 2080 Ti. If I don't, my computer will be 3-4 years old by the time the next generation of cards is released and/or AMD offers some competition to drive prices down. I'm pretty annoyed by the obnoxious prices of these cards, but I can't sit around waiting forever!
 
You pretty much locked yourself into that fate by buying a 4K monitor. Kind of hard to complain about high-end prices when you knew what you were signing up for at the time. A 2080 is no better than a 1080 Ti when it comes to 4K, so I would save a few bucks and just get a 1080 Ti. Why didn't you buy a 1080 Ti when they came out?
 
That said and agreed, a 1080Ti/2080 is just about bare-minimum for 4k. Like regularly dropping below 60FPS bare minimum. I'd recommend stretching to the 2080Ti if possible, if you plan on keeping the GPU for a while.

Now, if you're not against upgrading in the future, the 1080Ti makes more sense; prices on the 2000-series should (my prediction, take with salt!) start dropping as 1000-series stock dries up and 2000-series production catches up with demand. I have no idea when that will happen, and it may not happen till the 2000-series replacement comes along; in that time, AMD might bring in some competition, or even Intel, but it's something to ponder.
 
What pisses me off is when a single GPU costs the same as a complete PC. Prices for just about every other component have not increased that dramatically.

Nvidia has been the leader in innovation, but is also to blame for current pricing.
 
Eh, if you want to play well at 1080p60, your price isn't that high. If you want to play at 1440p120+, you're going to pay a little more; 4k120, well, you get the picture.

I don't see mainstream gaming as being less accessible due to the current prices of the highest-end GPUs. If anything, it's become more accessible, mining craze notwithstanding, outside of the insane cost for memory. Perhaps Intel's 10nm setbacks have contributed to that a bit as well.
 