2080 Ti for $1000 confirmed, HDMI 2.1/VRR confirmed dead for another year

We didn't get any perf gains from the tensor cores with the Titan V, so if it were capable of that, Nvidia must have been sandbagging with the drivers.

Maybe, though I find it more realistic that game devs need to program their games specifically to utilize the tensor cores.
 
If the performance was there, he couldn't have resisted showing it. I want to be wrong when I say this, but I think everyone is going to be shocked when they realize they bought a slightly better 1080 Ti with a feature not many games will utilize (at least for a while).

Remember PhysX.
 
Outside of making ray tracing viable and enabling a new type of AA (and I don't know whether the new AA works without ray tracing), there's zero use for tensor cores in gaming unless things have changed.
 

Don't forget, they still have a month to sell off the 10 series.

About PhysX: that was an Nvidia-only feature. Ray tracing doesn't belong to Nvidia; the technique has always been around, with open implementations.
 
This is absolutely a wonderful opinion piece. I think this is exactly what is going on. Nvidia is cashing in on the RTX hype train before the true uplift in performance with 7nm, for now going for people's wallets. He even mentions how all the demos shown were ray tracing related, with not one actual performance comparison. The reasoning for the 2080 Ti being released now makes sense too: cash in now, before 7nm. And boy are they cashing in!

 
 
no benches til sept 20ish prob

I watched that video and I agree with him there at the end (assuming, of course, the 10-15% performance estimate is accurate): Nvidia is cash-grabbing before 7nm, using RTX as a hype train to make people feel their Pascal cards are now outdated.

I do hope I'm wrong, of course, but it's hard to see how, especially considering Nvidia didn't show a single performance number, and that gloating SOB Jensen wouldn't have been able to resist if he had something good to show.
 

Yeah, that is what's fishy. No gaming benchmarks, all RTX hype, and no benches during a full month of preordering. Cash in on the fanboys, and then the benchmarks come out. It's pretty crazy how Nvidia can just control this entire launch. I mean, can't blame them; people are FOMOing hard because they are so desperate to upgrade.
 



I don't think they usually show benches? I don't remember any. And how do you show a quality improvement, which is what they were going for, if nothing else can actually run the feature? What do you even bench against?

They could have just slapped 50% more CUDA cores into the same die space, and I'd buy that for $1200 in a heartbeat. Instead they did this.

I want to add that I have always been skeptical of anything that requires game-by-game implementation, which both ray tracing and the tensor AA do.

What's the end game, though? I figure it could be the most efficient way to improve graphics. The more sinister read is that it locks everyone else out of the market. If they were to deliver in a big way, no one else has the tech (which is probably patented, I would think) to get anywhere close.

I give nVidia more credit than simple cash grabs...
 
The question is... given the availability of the 1080/1080 Ti, what will be the 2070's selling point? To an extent, this affects the 2080 as well, given its price.

Without actual test data to go on right now it's anyone's guess, but if you ask me: I expect the Pascal cards to start going out of stock real soon.

Did they mention at the unveiling that it will be much more powerful than the 1080ti?

All of the stuff during the presentation was focused on "RTX ops" and Tensor ops and they kind of glossed over performance if you didn't account for the RT and other stuff. If you went by their marketing ops, the 2080ti is 4x faster than a 1080ti. All those RTX ops.

But GPUDB has them listed as:

1080ti:
GPU Clock 1481 MHz
Boost Clock 1582 MHz

Pixel Rate 139.2 GPixel/s
Texture Rate 354.4 GTexel/s
FP16 (half) performance 177.2 GFLOPS (1:64)
FP32 (float) performance 11,340 GFLOPS
FP64 (double) performance 354.4 GFLOPS (1:32)

2080ti:
GPU Clock 1350 MHz
Boost Clock 1545 MHz

Pixel Rate 136.0 GPixel/s
Texture Rate 420.2 GTexel/s
FP16 (half) performance 210.1 GFLOPS (1:64)
FP32 (float) performance 13,448 GFLOPS
FP64 (double) performance 420.2 GFLOPS (1:32)

So if that holds true, it would put the 2080 Ti about 15-20% faster than a 1080 Ti, before you start counting "RTX ops".

I guess we will have to wait and see, but I have a feeling they were focused on "RTX ops" because no one would spend $1200 for a 15% performance boost over a 1080ti.
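For a rough sanity check on that 15-20% figure, here's a quick back-of-the-envelope script using only the numbers quoted above. Keep in mind those are GPUDB listings for an unreleased card, not confirmed specs, and raw throughput ratios ignore any IPC or memory changes:

Code:
# Rough throughput comparison from the GPUDB numbers quoted above.
# These are listed/rumored specs, not measured game performance.
gtx_1080_ti = {"fp32_gflops": 11340, "pixel_rate_gpix": 139.2, "texture_rate_gtex": 354.4}
rtx_2080_ti = {"fp32_gflops": 13448, "pixel_rate_gpix": 136.0, "texture_rate_gtex": 420.2}

for key in gtx_1080_ti:
    gain = rtx_2080_ti[key] / gtx_1080_ti[key] - 1.0
    print(f"{key}: {gain:+.1%}")
# fp32_gflops: +18.6%, pixel_rate_gpix: -2.3%, texture_rate_gtex: +18.6%
# i.e. the raw numbers land right in that 15-20% window, before any "RTX ops".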

Have they shown any gaming benchmarks yet? Can't find any.

I think the closest thing we got to a performance number was when JHH introduced the Unreal Infiltrator demo. He said that the card they were running on (a 2080 Ti, I'm pretty sure) was pushing 70+ frames per second, but the demo was locked to 60 Hz with vsync. The framerate counter they had going showed a solid 59-60 the whole time. For comparison, he said the 1080 Ti ran it at around 40 fps. To me that says the RT didn't hurt performance as much as they said it would (otherwise the 1080 Ti would be running in the 10-20 fps range), or maybe I misunderstood what he was trying to say.
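Taking those keynote numbers at face value (70+ fps on the new card, roughly 40 fps on a 1080 Ti, neither one an independent measurement), the implied uplift in that one demo works out like this:

Code:
# Implied speedup from the Infiltrator demo figures mentioned in the keynote.
# Both numbers are on-stage claims, not benchmarks, and the new card was vsync-capped.
new_card_fps = 70.0      # "70+ frames per second"
gtx_1080_ti_fps = 40.0   # "around 40 fps"
print(f"implied speedup: {new_card_fps / gtx_1080_ti_fps:.2f}x")  # 1.75x, more if "70+" is conservative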
 
A. Jensen said he would give each dev a Turing card. His face looked full of regret. B. Jensen said one Turing is twice as fast as a 1080 Ti at 4K, high detail. Same face.
 
The Infiltrator demo did not have ray tracing. It was a demo for DLSS.
 
A. Jensen said he would give each dev a turing. Face looked full of regret. B. Jensen said 1 turing is twice as fast as 1080ti in 4k high detail, same face.

I was sort of weirded out by that. Like... how do their software guys operate at all if they don't have the hardware already? Does he make them buy these things themselves?

Or did he mean he was going to give each of them one to keep? I'd be kind of insulted if that was the bonus he offered me for ten years of hard work that will ultimately make him billions. "Oh, you're going to give me a $1000 graphics card that I helped you design? How about a fucking raise instead, asshole?"
 
Without actual test data to go on right now it's anyone's guess, but if you ask me: I expect the Pascal cards to start going out of stock real soon.

Yes it's all speculation at this point.

But given specs alone, you can normally (but of course, not always) spot the reason why next generation's card should/should not be significantly faster than previous gen's.

This is purely speculation on my part:
Kepler to Maxwell -- 780 vs 980 was more of a minor architectural improvement plus a big 20-30% clock speed increase, hence the ~20% improvement.
Maxwell to Pascal -- 980 vs 1080 had a large improvement, probably because it was a combination of more CUDA cores and vastly higher clock speeds.
Pascal to Turing (1070 vs 2070, 1080 vs 2080) -- seems like a wash in terms of CUDA cores and clock speeds. Not expecting a very large jump unless the architecture itself is vastly superior to Pascal's. The 2080 Ti seems to bring a lot to the table, though I'm not willing to pay for it given the games I play.
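One crude way to put numbers on that reasoning is to scale expected throughput as CUDA cores x boost clock and ignore IPC entirely, which is exactly the unknown here. The core counts and clocks below are approximate reference-card figures I'm assuming, not anything Nvidia has published as a comparison:

Code:
# Naive generational comparison: CUDA cores x rated boost clock, no IPC term.
# Core counts and clocks are approximate reference-card figures (assumptions).
cards = {
    "GTX 780":     (2304,  900),
    "GTX 980":     (2048, 1216),
    "GTX 1080":    (2560, 1733),
    "RTX 2080":    (2944, 1710),
    "GTX 1080 Ti": (3584, 1582),
    "RTX 2080 Ti": (4352, 1545),
}

def naive_rate(name):
    cores, boost_mhz = cards[name]
    return cores * boost_mhz

pairs = [("GTX 780", "GTX 980"), ("GTX 980", "GTX 1080"),
         ("GTX 1080", "RTX 2080"), ("GTX 1080 Ti", "RTX 2080 Ti")]
for old, new in pairs:
    gain = naive_rate(new) / naive_rate(old) - 1.0
    print(f"{old} -> {new}: {gain:+.0%} (cores x clock only)")
# 780 -> 980 comes out around +20%, 980 -> 1080 around +78%,
# and both Pascal -> Turing pairs land in the +13-19% range.

By this metric the 980 to 1080 jump really was mostly cores and clocks, and Turing looks like a modest bump unless the new SM does a lot more per clock.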
 
I was sort of weirded out by that. Like... how do their software guys operate at all if they don't have the hardware already? Does he make them buy these things themselves?

Or did he mean he was going to give each of them one to keep? I'd be kind of insulted if that was the bonus he offered me for ten years of hard work that will ultimately make him billions. "Oh, you're going to give me a $1000 graphics card that I helped you design? How about a fucking raise instead, asshole?"

Well, that's better than some other outfits. They call a party, and 75 percent of the team gets pink slips at that party.
 
Doesn't the Ti version Turing card have about 15% more CUDA cores than Pascal? I seem to recall the clock speeds being pretty close, so one would hope it's maybe 10 to 15 percent faster.
 
The Ti Turing brings a lot, yes, but the 2080 and 2070... not really.

On the Ti Turing though, if it's only 10-15% faster, it seems like a heavily overclocked 1080 Ti could close the gap...
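Rough math on the overclocking argument, using the rated boost clocks quoted earlier in the thread and an assumed ~2.0 GHz overclock (typical for a good 1080 Ti, but very card-dependent):

Code:
# Can an overclocked 1080 Ti close a ~15-20% stock-vs-stock gap?
# The 2080 Ti gain is the FP32 ratio from the GPUDB numbers above; the OC clock is an assumption.
stock_2080ti_fp32_gain = 13448 / 11340 - 1.0   # ~18.6%
rated_boost_1080ti = 1582                       # MHz, from the spec list above
assumed_oc_1080ti = 2000                        # MHz, a typical good overclock (assumption)

oc_gain = assumed_oc_1080ti / rated_boost_1080ti - 1.0
print(f"1080 Ti clock gain vs rated boost: {oc_gain:+.1%}")        # about +26%
print(f"2080 Ti stock FP32 advantage: {stock_2080ti_fp32_gain:+.1%}")
# Caveat: GPU Boost already runs well above the rated boost clock at stock, so the
# realistic OC gain is smaller (often ~8-12%), and the 2080 Ti can be overclocked too.
# An OC'd 1080 Ti probably narrows the gap rather than erasing it.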
 
^ the 2080ti is $1200 LOL

I think the real takeaway here is that game devs will perhaps save tons of hours on creating simulated shadows and lighting. That time can be used elsewhere.

Sorry, but buying a GPU for RT alone, especially at these prices, is beyond insane.

People are going to regret dumping their 1080 Tis.
 
What would be kinda neat is if they released a card with just the RT cores to run alongside a Pascal card, offloading the RT to the Turing part, a la the old dedicated PhysX cards.
 
This announcement makes me wonder about a couple of things. One, I hope that non-RTX performance really is a big jump, but expectations have been set at 50% more than the previous gen, and pricing is in line with that. Two, I wonder if parts will be scarce, which, combined with a lack of competition, would explain a lot. There will be enough people who buy this stuff at $1k for an aftermarket Ti or $1.2k for a Founders Edition if you're only able to make a limited quantity. Plus there are no natural competitors. I remember one of the earlier GeForce cards that launched at a new price high at the time was met with outrage; when the Radeon part launched maybe 6 months later, there was a huge price drop (still higher than the Radeon, if memory serves).
Finally, no midrange parts yet (x060 and x050). What about the old 1060, 1070 and 1080 parts? Are they being kept around at new, lower prices?
 
Because a heavily overclocked 2080Ti could further increase the gap? I've never understood this logic.

It's self-justification for what you bought, whether due to real financial limitations or a self-imposed budget. It's not meant for external consumption; it's just outward projection to reassure one's self.
 
This is going to be a long month listening to factless conjecture. Release my Kraken, er, Ti already.

10% over 1080 Ti

GTFO
 
The Ti prices are the starting street prices. Or they're a little lower. I was looking at Ti's, mostly.

When do the reviews go up?

edit: is a 50-100% increase going from an AIB vendor 1080 (non-Ti) to a 2080 Ti a reasonable expectation?
 
This is a bit random, but I don't get these 2.75 slot cards. If you're gonna make it 3 slot, make it 3 slot. It's not like I can use the 0.25 slot. And they should all have 3 actual PCIe brackets to help keep those massive cards from sagging.
 
Yeah... The more I read the more I lean toward canceling my 2080 Ti preorder. It just doesn't seem worth it all things considered. Next generation will be much better with mature RT tech/adoption, 7nm and competition from Intel (and possibly AMD). At least Nvidia lets you cancel before the item is shipped, so I have until Sept 20th to decide. Hopefully we'll get more benchmarks by then.

I was honestly hoping that with 7-10x the compute performance of the previous generation, I could mine with the card while not gaming to defray some of the cost. Mining is basically dead for GPUs now, but the Titan V still makes $50/month, and the 2080 Ti will have more Tensor cores + RT cores, so I figured ~$60/month would be feasible to knock a few hundred off the price over the next few months, but that's a gamble on top of everything else, and I'm honestly not sure if the 2080 Ti is even worth $800 if you already have a 1080 or 1080 Ti. So... lots to consider before Sept 20th.
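To put rough numbers on that (every input here is a guess, not a real projection):

Code:
# Hypothetical mining offset on a 2080 Ti preorder. All inputs are guesses.
card_price = 1200                # Founders Edition preorder price
assumed_income_per_month = 60    # guess, extrapolated from the Titan V's ~$50/month
months = 6                       # "over the next few months"

offset = assumed_income_per_month * months
print(f"offset after {months} months: ${offset}")      # $360
print(f"effective card cost: ${card_price - offset}")  # $840
# Still a gamble: difficulty and coin prices move constantly, and power costs aren't counted.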
 


Not sure who set those expectations. Most people are saying 20% or so in regular gaming. We shall see, but unless IPC has significantly increased (which I doubt), this is all about ray tracing. Nvidia is making a quick buck knowing 7nm will be here next year, so they have to cash in on the hype. Which they are.
 
It's going to be less than 20% overall. This generation of GPUs is a flop. It was a shitty, stupid time to release them anyway, given that we're on the cusp of HDMI 2.1. New GPUs should be aligned with new interface standards, not shoved out right before a new one's ready. Yet another hardware industry failure.

Here's my pessimistic breakdown of the whole thing.

* Every card in this lineup other than the 2080ti is going to be slower and more expensive than the 1080ti you can already buy today, and the 2080ti is only going to be marginally faster; so right out of the gate, this entire generation is basically worthless
* None of these cards is going to be able to drive 4K @ 120 Hz without dropping frames
* We have confirmation that this card isn't going to support HDMI 2.1, which would have been the only useful feature it could have had compared to the 1080 lineup, so it'll be completely worthless for 2019 TVs with HDMI 2.1
* Nvidia's too greedy and evil to actually support HDMI 2.1 VRR; they'll cling to G-Sync module sales until they're absolutely forced out of it
* There's basically no point in being an early adopter on new GPU technologies since all games are designed around consoles anyway. Raytracing demos are a fucking joke until there's an actual console generation of hardware out there with it and games are designed from the ground up with it in mind. Video games aren't designed for PC hardware anymore. They're designed around consoles and released on the PC as a second thought.
* The real version of these cards is coming out next spring with the 7nm refresh, which will give them enough headroom to actually improve performance outside of bullshit bogus raytracing benchmarks that were invented to make themselves look good
* To top everything off, Nvidia's current AI ray tracing technique has an end result that looks like shit; because it's essentially just guessing, there are all kinds of visual anomalies and defects that make it arguably look just as bad as the errors you get with traditional rasterization anyway.

This entire generation of GPUs is shit. Here's hoping that Navi isn't a complete piece of crap. This is a real opportunity for AMD. They could be the only game in town that actually supports HDMI 2.1 VRR on 2019 OLED TVs. Even if their performance isn't as good as Nvidia's, that's a HUGE advantage. Nvidia's blowing it.
 

Do you have a source that ray tracing is AI guessing? Or are you confusing ray tracing with DLSS?
 
Why do you say mining is dead for GPUs? Serious question; I haven't heard the news. Why the sudden obsolescence?
 
How's Microcenter on preorders and cancelling?

If you do it on the web, they probably treat it like any web order. You don't need to put in any credit card info doing a web order, and you can't be charged unless you show up and get the card.
No real need to cancel at all. If the reviews and benchmarks show that it's not worth it, you can just not show up; after 3 days they pull the card that was set aside for you and put it back on the shelf so someone else can get it.
 
Not sure who set those expectations. Most people are saying 20% or so in regular gaming. We shall see, but unless IPC has significantly increased (which I doubt), this is all about ray tracing. Nvidia is making a quick buck knowing 7nm will be here next year, so they have to cash in on the hype. Which they are.

Ray tracing and DLSS (the tensor-core AA). If you don't care about those, buy a 1080 Ti or wait.

What do we know about 7nm? Can it do dies over 600 mm^2 (or can it at least fit more transistors, since we know the "xx nm" labels can be misleading)? If not, I don't see it being vastly superior to 12nm, but everyone seems to be talking about it. Did a quick search last night and couldn't find anything.
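To frame that 7nm question with some placeholder numbers (the density multiplier here is purely an assumption, not a published figure for TSMC 12nm vs 7nm):

Code:
# Very rough 7nm headroom sketch. The density gain is an assumed placeholder.
tu102_die_mm2 = 754            # reported die size of the Turing TU102 (2080 Ti)
assumed_density_gain = 2.0     # assumption: transistors per mm^2 vs the 12nm process
target_die_mm2 = 500           # a smaller, cheaper high-end die for the sake of argument

relative_budget = (target_die_mm2 * assumed_density_gain) / tu102_die_mm2
print(f"transistor budget vs TU102: {relative_budget:.2f}x")   # ~1.33x
# Point being: even without another 750+ mm^2 monster die, a 7nm part could carry
# noticeably more logic, if that density assumption holds up.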
 


Pretty much the 2080ti is the only thing you should consider preordering as it is guaranteed to be faster than anything else. The 2080 could very well lose to the 1080ti and the 2070 might fall to the 1070ti, both at higher prices.

If I did have the money for something like the 2080ti, I would be looking at the Inno3D model. It has a 240 mm AIO cooler; looks interesting!

http://www.inno3d.com/products_detail.php?refid=386
 