Nvidia Turing: Time to Pay Respect.

If they were all available at a similar, reasonable price today, which would you rather purchase?

  • GTX 1080ti

  • RTX 2080

  • Vega 64 Liquid



Nightfire

From the release of the 20 series, which did nothing to advance price/performance, to rumors of a $500 RTX 3070 matching the performance of a $1,000+ RTX 2080 Ti, Turing has been the butt of many jokes around here. However, it is one of the most important releases in recent GPU history. Let's take a look:

Performance: $/fps saw no real improvement, but with DLSS it may end up seeing an even bigger gain than Pascal did over Maxwell. Software has taken its sweet time on this, but we are finally starting to get there. Without DLSS, 4K is still years away for many games, even on the best hardware.

RTX and other features: Pushing more and more pixels is cool and all, but the new features that Turing introduced are important for advancing the market.

Efficiency: Despite being a 'fake' die shrink, the move from Pascal to Turing saw much bigger efficiency improvements than the move from Turing to Ampere, which uses 8nm. Yeah, I know efficiency isn't everything, but there is a ceiling on how much power you can put into a card. If we only see a 10% improvement from generation to generation, we don't have very far to go.

That's about all I have to say about that. Turing was about taking one step back to go two steps forward. So to all you bullies out there, LEAVE TURING ALONE!
 
Turing.jpg
 

Attachments: The-Imitation-Game-012.jpg
Two years ago I wasn't sure whether to go 1080ti or 2080, as the performance was about the same.

Now I recently found out that the game I sink 3,000+ minutes per week into is about to start using ray tracing. That means not only have I gotten two good years out of my 2080 so far, but it's going to continue to pay off. No regrets here.
 
Had Turing been priced more reasonably, I would have been all over it. It was not, so I did not buy.

Nvidia being Ngreedia, I suppose, but much of the blame here goes to AMD as well. Also, new tech surely is not cheap.
 
It didn't make sense to me to pay $700 for a card that was marginally better than a 3+ year old 1080 Ti.
 
You act like you would actually have had a chance to buy one.
Lol

Actually had the chance to buy a 2080 Ti? There have been plenty of opportunities, so I'm not sure what you are saying. I would have had to pay over $2,000 Canadian for the privilege, which would have gotten me ~30% more speed than my 1080 Ti.
Now with Ampere I can get 30% more performance than a 2080 Ti, but instead of $2,000 CDN it cost me $1,050 CDN for the card that I purchased on launch day. Basically, what I am saying is that, model numbers aside, 30% more performance would have been OK with me; I just wasn't willing to pay almost twice the money for the privilege. Another way of looking at it: three years after buying a 1080 Ti, I am getting a card (3080) that is 70-90% faster for the same price! Seems pretty good to me.
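For anyone who wants to sanity-check that, here's a minimal back-of-the-envelope sketch using only the rough CDN prices and relative-performance estimates from this post (not benchmark data):

```python
# Back-of-the-envelope price/performance comparison.
# Prices are approximate CDN figures and the relative-performance
# numbers are the rough estimates quoted in this post, not benchmarks.
cards = {
    "GTX 1080 Ti (2017)": {"price_cdn": 1050, "rel_perf": 1.0},  # baseline
    "RTX 2080 Ti (2018)": {"price_cdn": 2000, "rel_perf": 1.3},  # ~30% faster
    "RTX 3080 (2020)":    {"price_cdn": 1050, "rel_perf": 1.8},  # ~70-90% faster
}

for name, c in cards.items():
    dollars_per_perf = c["price_cdn"] / c["rel_perf"]
    print(f"{name}: {c['rel_perf']:.0%} of baseline performance, "
          f"${dollars_per_perf:,.0f} CDN per 'unit' of 1080 Ti performance")
```

By that crude metric, the 2080 Ti cost roughly 45-50% more per unit of performance than the 1080 Ti did, while the 3080 comes in roughly 40-45% cheaper per unit, which is the whole point.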
Now, as for having purchased a 3080... I did get an order confirmation from Amazon, but two days later they still have not given me any kind of shipping date. Kind of feels like they sold me a pre-order and just didn't mark it as such... hopefully I won't have to wait too long.
 
Had Turing been priced more reasonably, I would have been all over it. It was not, so I did not buy.

It's true the price jump wasn't really warranted when Turing came out. If you were sitting on a high-end 10 series, there wasn't much of a reason to upgrade. For the popular GTX 1080, only the RTX 2080 Ti was an actual upgrade. However, the performance increase didn't seem reasonable given the cost increase. Having said that, I think the RTX 2080 Ti was worth it for me. I was on a 4K display at the time and a single GTX 1080 Ti just didn't cut it. I actually had two in SLI, but we know how useless that is now.
 
Can we pay respects to Pascal, as it seems more popular than Turing...

I think Pascal will roll on another year or two in the used market for kid/spare/etc. builds.

You can hit min spec on a budget without knowing anything.
 
I'll also remember "Turing", as in Alan Turing, as in the series clearly meant to be a crypto-mining beast. But then that didn't magically pan out, so suddenly they made it all about ray tracing, the other work Alan was famous for. :confused:
 
Funny thing is, with my 2070 Super XC Ultra I get 100+ fps in the games I play at 1440p. I really am debating now whether to upgrade at all. I mean, I'd like a 3080, but is it really necessary? More to the point, I'm totally not in a rush to upgrade at this point, especially with what's going on now.
 
Can we pay respects to Pascal, as it seems more popular than Turing...

I wonder how many times more popular? As in, how many Turing chips were sold compared to Pascal?

I'm going to be conservative and go with 4X... :shifty:
 
Funny thing is, with my 2070 Super XC Ultra I get 100+ fps in the games I play at 1440p. I really am debating now whether to upgrade at all. I mean, I'd like a 3080, but is it really necessary? More to the point, I'm totally not in a rush to upgrade at this point, especially with what's going on now.
In one word, NO. Avoid all the hype and stay with what you have, since it more than meets your needs.
 
From the release of the 20 series, which did nothing to advance price/performance, to rumors of a $500 RTX 3070 matching the performance of a $1,000+ RTX 2080 Ti, Turing has been the butt of many jokes around here. However, it is one of the most important releases in recent GPU history. Let's take a look:

Performance: $/fps saw no real improvement, but with DLSS it may end up seeing an even bigger gain than Pascal did over Maxwell. Software has taken its sweet time on this, but we are finally starting to get there. Without DLSS, 4K is still years away for many games, even on the best hardware.

RTX and other features: Pushing more and more pixels is cool and all, but the new features that Turing introduced are important for advancing the market.

Efficiency: Despite being a 'fake' die shrink, the move from Pascal to Turing saw much bigger efficiency improvements than the move from Turing to Ampere, which uses 8nm. Yeah, I know efficiency isn't everything, but there is a ceiling on how much power you can put into a card. If we only see a 10% improvement from generation to generation, we don't have very far to go.

That's about all I have to say about that. Turing was about taking one step back to go two steps forward. So to all you bullies out there, LEAVE TURING ALONE!

Turing was probably the WORST Nvidia release this decade. I'll break it down like you did:

Performance: Only one card at launch was as fast as the previous gen's flagship, and it was $1,200. When DLSS is available in 75% of games AT LAUNCH, maybe you can factor it in. But when it is NOT available for the VAST MAJORITY of games, it isn't a mainstream feature.

RTX and other features: See the breakdown of DLSS above; the argument is the same. Not to mention the extreme performance penalty when it is enabled. We are two years past the RTX launch and still only a handful of releases support RT, and most of those didn't have it at release. We can talk about Cyberpunk, but it's STILL not released yet. Cyberpunk is not the savior of Turing, like we've been hearing for a year.

Efficiency: Your argument is basically an indictment of Ampere, which just uses more power to get higher framerates. You would expect newer architectures to be more efficient, so you should expect Turing to be more efficient than Pascal. Just because Ampere is a power hog doesn't mean Turing was great. Honestly, the move from Vega to Navi was more impressive in terms of efficiency: Vega was a power hog, while a 5700 XT draws maybe 10W more at load than a 2070 Super. If AMD is to be believed (a big if), Big Navi should be significantly more efficient than Ampere (and likely the performance will suffer).
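To put very rough numbers on that perf-per-watt point, here's a minimal sketch. The board-power values are the vendor-rated figures; the relative-performance numbers are ballpark estimates assumed purely for illustration, not benchmark results:

```python
# Crude perf-per-watt comparison. Board power is the vendor-rated TDP/TBP;
# the relative-performance numbers are rough, assumed estimates
# (treating the RX Vega 64 as the 1.0 baseline).
cards = {
    "RX Vega 64":     {"board_power_w": 295, "rel_perf": 1.00},
    "RX 5700 XT":     {"board_power_w": 225, "rel_perf": 1.20},  # assumption: ~20% faster
    "RTX 2070 Super": {"board_power_w": 215, "rel_perf": 1.20},  # assumption: similar to the 5700 XT
}

base = cards["RX Vega 64"]
base_eff = base["rel_perf"] / base["board_power_w"]
for name, c in cards.items():
    eff = c["rel_perf"] / c["board_power_w"]
    print(f"{name}: {eff / base_eff:.2f}x the perf-per-watt of Vega 64")
```

Even with generous rounding, Navi lands around 1.5-1.6x Vega's perf-per-watt while drawing only ~10W more than a 2070 Super, which is the comparison being made above.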

IMO, Turing was two steps back to go 1 step forward. If I could go back in time and tell myself which card to buy, I would tell myself to buy a 1080Ti at launch and use it for 4 years. That's how bad Turing was.
 
Turing was probably the WORST Nvidia release this decade. I'll break it down like you did:

Performance: Only one card at launch was as fast as the previous gen's flagship, and it was $1,200. When DLSS is available in 75% of games AT LAUNCH, maybe you can factor it in. But when it is NOT available for the VAST MAJORITY of games, it isn't a mainstream feature.

RTX and other features: See the breakdown of DLSS above; the argument is the same. Not to mention the extreme performance penalty when it is enabled. We are two years past the RTX launch and still only a handful of releases support RT, and most of those didn't have it at release. We can talk about Cyberpunk, but it's STILL not released yet. Cyberpunk is not the savior of Turing, like we've been hearing for a year.

Efficiency: Your argument is basically an indictment of Ampere, which just uses more power to get higher framerates. You would expect newer architectures to be more efficient, so you should expect Turing to be more efficient than Pascal. Just because Ampere is a power hog doesn't mean Turing was great. Honestly, the move from Vega to Navi was more impressive in terms of efficiency: Vega was a power hog, while a 5700 XT draws maybe 10W more at load than a 2070 Super. If AMD is to be believed (a big if), Big Navi should be significantly more efficient than Ampere (and likely the performance will suffer).

IMO, Turing was two steps back to go 1 step forward. If I could go back in time and tell myself which card to buy, I would tell myself to buy a 1080Ti at launch and use it for 4 years. That's how bad Turing was.

I am not saying that Turing was the best release for consumers at launch, but it did drive the industry forward in ways other than MOAR teraflops. As for efficiency, we will get a better grasp of this by comparing the 2080 Ti to the 3070.

Yes, RTX does take a performance hit, but Ampere did not improve the efficiency at all either.
 
GTX 1060 to RTX 2060 Super is one of the biggest jumps I've ever experienced.

1060 is like 190 GB/sec memory bandwidth.
2060 Super is around 448 GB/sec
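Those bandwidth figures fall straight out of bus width times effective data rate. A minimal sketch, assuming the commonly listed specs (192-bit @ 8 Gbps GDDR5 for the 1060 6GB and 256-bit @ 14 Gbps GDDR6 for the 2060 Super):

```python
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bits per transfer / 8) * transfers per second in giga."""
    return bus_width_bits / 8 * data_rate_gbps

# Commonly listed specs, assumed here; check the exact board you own.
print(mem_bandwidth_gbs(192, 8.0))   # GTX 1060 6GB   -> 192.0 GB/s
print(mem_bandwidth_gbs(256, 14.0))  # RTX 2060 Super -> 448.0 GB/s
```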
 
GTX 1060 to RTX 2060 Super is one of the biggest jumps I've ever experienced.

1060 is like 190 GB/sec memory bandwidth.
2060 Super is around 448 GB/sec

It's 3 years newer and $100 more expensive (at launch). You should EXPECT a big jump. At the time of the 2060 Super launch, the 1060 was averaging about $150 in the used market (which is probably a good barometer of the performance level).
 
I don't understand why the Vega 64 is up there; it's nowhere near the performance of a 2080.

If the price and performance of the 1080 Ti/2080 were exactly the same, I would go with the 2080. Ray tracing is horrific on it, yes, but I do really want to play the RT version of Quake 2. That's it: a 25-year-old game looking shiny is the only good thing to say about the entire 2000 series.
 
I am not saying that Turing was the best release for consumers at launch, but it did drive the industry forward in ways other than MOAR teraflops. As for efficiency, we will get a better grasp of this by comparing the 2080 Ti to the 3070.

Yes, RTX does take a performance hit, but Ampere did not improve the efficiency at all either.

You literally just said in the OP to pay respect to Turing :p, so why would you then backtrack and say it wasn't the best release for consumers? Either you believe your OP or you don't.

I, for one, refuse to pay respect to Turing.
 
I don't understand why the Vega 64 is up there; it's nowhere near the performance of a 2080.

If the price and performance of the 1080 Ti/2080 were exactly the same, I would go with the 2080. Ray tracing is horrific on it, yes, but I do really want to play the RT version of Quake 2. That's it: a 25-year-old game looking shiny is the only good thing to say about the entire 2000 series.

I agree with that. I would just say that the 1080Ti and 2080 have never been the same price/performance, so it's hard to make that determination.

And I agree that the Vega 64 shouldn't be in the same conversation, as it is a different class of card. The Radeon VII might be a better option as it was priced comparably and was somewhat more of a competitor to the other two cards.
 
I wonder how many times more popular? As in, how many Turing chips were sold compared to Pascal?

I'm going to be conservative and go with 4X... :shifty:

You mean 4:1 in favor of Pascal, right? Probably pretty close. Looking at the Steam data, Pascal is still about 1.5x what Turing is, and figure a lot of those Turing owners were previously Pascal owners.
 
I agree with that. I would just say that the 1080Ti and 2080 have never been the same price/performance, so it's hard to make that determination.

And I agree that the Vega 64 shouldn't be in the same conversation, as it is a different class of card. The Radeon VII might be a better option as it was priced comparably and was somewhat more of a competitor to the other two cards.
I was under the impression that performance between these two was pretty much identical, as was the price...?
 
The 1080 Ti sits about 7-10% behind the 2080 in frames at 1080p, 1440p, and 4K (sauce).

Price-wise, the 1080 Ti MSRP was $700, and the 2080/2080 Super were around that too.
 
I was under the impression that performance between these two was pretty much identical, as was the price...?

Which card?

The 1080Ti/2080 comparison?
The 2080 was $799 at launch while the 1080Ti was essentially still $699. In the used markets after the launch of the 2080 Super, the 1080Ti was ~$500 and the 2080 ~$600.

Vega 64 to the other cards? The LC version was $699, but it has never been anything more than a 1080 competitor with higher power draw. Availability was never great either, at least compared to the air-cooled V64 and the 1080. It was kind of a halo product.
 
The 1080 Ti sits about 7-10% behind the 2080 in frames at 1080p, 1440p, and 4K (sauce).

Price-wise, the 1080 Ti MSRP was $700, and the 2080/2080 Super were around that too.

You can't compare the MSRP of a 2-year-old card against its replacement. I mean, why not also compare the 980 Ti then, as it was similarly priced?

The Super was $699, but that was 3 years after the 1080Ti released.

Throughout the lifecycle of the Turing cards, the 1080Ti was a $500-550 used card depending on the model for the most part.

Edit: Not sure what the ACC schedule has to do with the discussion ;).

Edit2: I should have read the poll more closely. I think I actually voted for the 2080 even though I still think Turing wasn't great. I think the 2080 is a better card, but it should be, as it released two years after the Pascal-based cards.
 
Which card?

The 1080Ti/2080 comparison?
The 2080 was $799 at launch while the 1080Ti was essentially still $699. In the used markets after the launch of the 2080 Super, the 1080Ti was ~$500 and the 2080 ~$600.

Vega 64 to the other cards? The LC version was $699, but it has never been anything more than a 1080 competitor with higher power draw. Availability was never great either, at least compared to the air-cooled V64 and the 1080. It was kind of a halo product.

I thought the 2080 was released at $699 as well... was it $799? My bad. Performance is basically the same though, with the 1080 Ti having more RAM and the 2080 having DLSS and RTX, so in my own mind I thought they were pretty similar.
 
I thought the 2080 was released at $699 as well... was it $799? My bad. Performance is basically the same though, with the 1080 Ti having more RAM and the 2080 having DLSS and RTX, so in my own mind I thought they were pretty similar.

FE cards were $799. At release, my thought was: why would I want to spend $100 more for the same performance I already had with my 1080 Ti? I cancelled my 2080 pre-order (I think it was an $849 EVGA card) and stayed with the 1080 Ti. In my mind, the greatest disservice Nvidia did to Turing was not having more than one card that was clearly 10-20% better than the previous gen's flagship, especially when that one card was so much more expensive than the previous flagship.
 
Putting the Vega 64 LC up there was sort of a joke, as AMD really did not have much to offer in that class until the 5700 XT.

Also, I never said Turing was the best for consumers at launch in any way. I was simply stating that the larger development program was important for the industry and, eventually, consumers.
 