How do you feel about Nvidia pricing on the new RTX video cards?

The pricing is laughable.

I'm sitting on a Titan Xp, and these prices and this performance do nothing for me. And that BFV ray tracing video just makes ray tracing look like a stupid gimmick right now, just like tessellation ultimately ended up being, IMHO. I know it's the future, but not with these cards just yet.

"What's the point?", as far as I'm concerned.


Haha, I thought the same. They were bragging about how realistic it is, and I'm like, okay, everything is super shiny now, the light coming through a window is a perfect rectangle with equal sides, and the rest of the room is dark? So much for realistic. In what world does that much light coming through a window not illuminate the rest of the room? Yeah, it looked shiny, but hardly realistic.
 
It's those damn miners!

SLI performance though, is it now handled by hardware, or is it still dependent on buggy code from the developer?
 
It's those damn miners!

SLI performance though, is it now handled by hardware, or is it still dependent on buggy code from the developer?

Well, it'll be via NVLink now rather than SLI. Too early to say how well it'll work. Did some googling; this seems to cover NVLink versus SLI:

Reddit: SLI versus NVLink

So it sounds like SLI syncs the two cards to work together in tandem, whereas NVLink 'simply' provides direct memory access between the cards. How that gets leveraged to increase gaming performance, I wouldn't know. It was apparently disabled on the Titan V.
 
Who is this Ray Tracing, and why did he cause the prices of cards to go up?

Before I even THINK about dropping hundreds on a card, I'll wait for the [H] review. Only then will I sell my kidney.
 
So, it seems I'll either stick with my GTX 970 or maybe get a used 1080 Ti.
But no, fuck this :D Even the 1080 Ti at ~800€ is a bit much for me, but I would've been okay with it. This? No. Just no.
 
Excited for the 2070 but def waiting for the [H] reviews. We already heard from leaks that the new gen would be priced higher, maybe even at these exact figures. $500 is a lot for the 70-series card, but every generation or so is priced higher than the previous. My 970 cost $330... maybe $350 MSRP? Wasn't the 1070 $400 at launch?
Between ahole miners (is that still a thing???) and AMD's lack of... anything... Nvidia knew they could raise prices. Were they making money off the mining craze, or just the retailers? If not, now is when they cash in.

Luckily I have a couple months before I upgrade- plenty of time to learn 2070's value.

Ray tracing looks incredible. It's probably the last piece preventing games from appearing realistic. I hope it's as (relatively) easy and fast to implement as Nvidia claimed in the keynote... maybe it will be tacked on to some new releases that hadn't planned to use it. We know Unreal Engine will support it... considering probably half the games being made use UE, there's a good chance we'll get some nice RT surprises in the next year. Assuming it isn't a complete dud...

Finally, I can get behind the new RTX name, but going from 1000 to 2000 instead of 1100 will forever make me angry. THAT'S NOT HOW NUMBERS WORK. 500, 600, 700, 800/sorta..., 900, 1000, 2000? NO! GO TO YOUR ROOM!
 
I was looking at the new card price announcements as an upgrade for my Linux/1060 6GB box.

Found a GTX 1080 locally in the return bin for $242 + tax...

I went 1080.

Linux box = 1080
Win box = 1070 SLI.

I've never paid more than about $275 for any of my graphics cards.
 
It's those damn miners!

SLI performance though, is it now handled by hardware, or is it still dependent on buggy code from the developer?

APIs haven't changed yet, so any requirements to make SLI/mGPU work remain the same AFAIK. (e.g. DX11 and earlier - Nvidia and game devs. DX12 - 100% game devs)
 
Given the new tech on the chip, smaller node, and massive die, I think the pricing is justified at this point in the game. As we move forward I think we can expect prices to fall back to "normalcy" with future generations.
APIs haven't changed yet, so any requirements to make SLI/mGPU work remain the same AFAIK. (e.g. DX11 and earlier - Nvidia and game devs. DX12 - 100% game devs)
Theoretically, NVLink would make two cards appear as one to the API, so the API would not have to change at all. Games would continue seeing only one card and treating it as such. It would take the game developers out of the picture. I am interested in seeing exactly how NVIDIA handled it on the gaming side.
 
Honestly, Nvidia is brilliant; they know people will buy anything they sell after the success of Pascal. I mean, look at it: preorders sold out with absolutely no reviews. Everyone bought this on hype. Hell, Nvidia didn't even give a direct comparison to the current gen in games; all they talked about was RTX ops. That's it, no direct gaming performance figures like they showed when they launched Pascal. They know people are going to buy, and fanboys are nuts. Can't blame them. I honestly never thought I would see the day when a $1,200 GPU sells out without even a single review. That is how bullish Nvidia is about their mindshare. Absolutely crazy! Almost double the price of last gen on the Ti, and people bought them without even seeing a single review.

That presentation was boring as fuck; all he talked about was ray tracing, and all he compared was ray tracing calculations. Hey, Turing is 5-6x faster, and the RTX 2070 is faster than the Titan Xp in, guess what? Ray tracing ops, lol! I just think a lot of people took that to mean they are getting double the performance of a Titan Xp.

Nvidia is like the Apple of video cards now. They make people buy, no questions asked. This is the new norm: a $1,200 top-of-the-line video card that used to be $699, sold out without any reviews. Just brilliant by Nvidia. They own their fanbase, lol.


Ray tracing is nice, but I don't get this thing about a realistic image. Seriously, nothing is that shiny in real life. The Metro image he was showing, where the light was in a square around the desk, was unrealistic as hell. I mean, you're telling me that when light comes through the window it lands as a perfect window-shaped patch and doesn't go anywhere else? It wouldn't be pitch dark around the table if they were trying to make a realistic image.

AMEN, BROTHER. Apple and Nvidia customers who buy with no questions asked have a cult-like mentality. I've always thought that and still do.
 
I feel like I'll be keeping my 1080 Ti and hoping AMD comes out with something in a year or so. I only play WoW, so there's no need for me to spend a mortgage payment on a graphics card.
 
Really going to depend on benchmarks. If they're good, I'll hop on.

There's literally no competition in the high end so this is what happens.
 
I feel like I'll be keeping my 1080 Ti and hoping AMD comes out with something in a year or so. I only play WoW, so there's no need for me to spend a mortgage payment on a graphics card.
I am really interested in the mentality of "if AMD comes out..." If they had a card that could match 1080 Ti/2080 Ti performance, would they sell it for much less? Looking at Vega, I doubt it.
 
LOL at people defending these prices. Love the phan boi stupidity posts as they are always good for laughs. :ROFLMAO:
 
AMEN, BROTHER. Apple and Nvidia customers who buy with no questions asked have a cult-like mentality. I've always thought that and still do.
I disagree. People pre-order the newest Apple stuff because they're Apple fans. They aren't necessarily the best phones, but to Apple fans they are.

People pre-order the newest Nvidia card because it will be the undisputed fastest card. It doesn't matter if you're a fan of Nvidia or not. If you're a fan of having the fastest gaming PC right now you buy Nvidia.
 
Given the new tech on the chip, smaller node, and massive die, I think the pricing is justified at this point in the game. As we move forward I think we can expect prices to fall back to "normalcy" with future generations.

Theoretically, NVLink would make two cards appear as one to the API, so the API would not have to change at all. Games would continue seeing only one card and treating it as such. It would take the game developers out of the picture. I am interested in seeing exactly how NVIDIA handled it on the gaming side.

I must have missed that in the reveal, because I don't remember that at all.
 
If I didn't have a family to feed I'd probably go for it and get the latest and greatest. But at these prices I'll just stick with my 1080 for now.
 
I must have missed that in the reveal, because I don't remember that at all.
It was during the part where Jensen pulled out the DGX. I say "theoretically" because the functionality of the Tesla hardware is most likely different from GeForce. Who knows, NVLink on GeForce may simply keep traditional SLI behavior and just utilize the increased bandwidth.
 
People pre-order the newest Nvidia card because it will be the undisputed fastest card. It doesn't matter if you're a fan of Nvidia or not. If you're a fan of having the fastest gaming PC right now you buy Nvidia.

The 2080 Ti is going to be the fastest gaming-level GPU on the market. You might dislike the cost, and that's fair enough, but few will dislike the performance.
 
Pricing seems nuts to me, but I'm an idiot and will probably bite. I'm really curious about benchmarks, though. Getting one is tough enough as-is; if the benchmarks ARE good, they might be sold out for 8 months.
 
The new ray tracing tech made it inevitable that prices were going to be higher than normal... I still think Nvidia is crazy to overhype this so soon... it's going to take multiple generations to refine the tech.

From everything that I saw, the ray tracing is supposed to be easier to implement compared to what we use today. I suppose that means the responsibility is on Nvidia to make sure the drivers work as expected. Another aspect that I think is getting overlooked is the extensive push for A.I. to render things it knows are supposed to be there, as compared to what is really in the frame. My guess is that the A.I. components are really what's behind the big claims in ray tracing and everything else new with this generation.
 
While I think the pricing is crazy, what's crazier to me is how similar RTX is to RX (you know, from that other brand?).

For a brand-conscious company like Nvidia, I'm not sure if this is a foobar.
 
It was during the part where Jensen pulled out the DGX. I say "theoretically" because the functionality of the Tesla hardware is most likely different from GeForce. Who knows, NVLink on GeForce may simply keep traditional SLI behavior and just utilize the increased bandwidth.

Oh, so he didn't expressly say that multi-card functionality had changed. Yeah, something like that would (should?) have been a key reveal, I would think. I'll be pleasantly surprised if it turns out to be the case, as multi-card gaming would (likely) become worthwhile again.
 
LOL, young whippersnapper... get back to me when you drop $5,000 of early-'90s money on a 50 MHz 486 so you can play Dynamix's Red Baron and Brøderbund's Stunts at a proper framerate!

That's the thing, though... you pretty much *had* to pay that if you wanted to play the game. Whereas now, you can play the game at 1080p 60 FPS for a fraction of what a new card (and a monitor that can utilize it) costs.
 
Oh, so he didn't expressly say that multi-card functionality had changed. Yeah, something like that would (should?) have been a key reveal, I would think. I'll be pleasantly surprised if it turns out to be the case, as multi-card gaming would (likely) become worthwhile again.

If you read the blurb on the 2080 product page, I get the impression it's just an amped-up SLI, as Armenius suggested.

https://www.nvidia.co.uk/geforce/graphics-cards/rtx-2080-ti/#sli
The GeForce RTX NVLink™ bridge connects two NVLink SLI-ready graphics cards with 50X the transfer bandwidth of previous technologies. This means you can count on super-smooth gameplay at maximum resolutions with ultimate visual fidelity in GeForce RTX 2080 Ti and 2080 graphics cards.
 
Honestly, Nvidia is brilliant; they know people will buy anything they sell after the success of Pascal. I mean, look at it: preorders sold out with absolutely no reviews. Everyone bought this on hype. Hell, Nvidia didn't even give a direct comparison to the current gen in games; all they talked about was RTX ops. That's it, no direct gaming performance figures like they showed when they launched Pascal. They know people are going to buy, and fanboys are nuts. Can't blame them. I honestly never thought I would see the day when a $1,200 GPU sells out without even a single review. That is how bullish Nvidia is about their mindshare. Absolutely crazy! Almost double the price of last gen on the Ti, and people bought them without even seeing a single review.

That presentation was boring as fuck; all he talked about was ray tracing, and all he compared was ray tracing calculations. Hey, Turing is 5-6x faster, and the RTX 2070 is faster than the Titan Xp in, guess what? Ray tracing ops, lol! I just think a lot of people took that to mean they are getting double the performance of a Titan Xp.

Nvidia is like the Apple of video cards now. They make people buy, no questions asked. This is the new norm: a $1,200 top-of-the-line video card that used to be $699, sold out without any reviews. Just brilliant by Nvidia. They own their fanbase, lol.


Ray tracing is nice, but I don't get this thing about a realistic image. Seriously, nothing is that shiny in real life. The Metro image he was showing, where the light was in a square around the desk, was unrealistic as hell. I mean, you're telling me that when light comes through the window it lands as a perfect window-shaped patch and doesn't go anywhere else? It wouldn't be pitch dark around the table if they were trying to make a realistic image.

I don't really see the issue with people pre-ordering. You already know this will be the single fastest card, if that is what you are after, and you can easily cancel the pre-order if the benchmarks don't align with your performance-to-dollars expectation. Based on previous Ti card availability at launch, I would say pre-ordering is the safe bet if you think you might want one.

I don't personally have high hopes for ray tracing in games in the short term, but I am probably in for a 2080 Ti for machine learning either way.
 
I'm willing to reserve judgement until the reviews come out. My first impression is that I'm not happy, but if these cards have insane performance to match their insane pricing, I say it's fine. By that I mean the RTX 2080 Ti needs to perform about twice as fast as a GTX 1080 Ti, with the RTX features turned off. Otherwise this pricing is too expensive IMO.
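To put a quick number on that bar, here's a minimal perf-per-dollar sketch. The launch prices are assumptions taken from the figures being thrown around in this thread ($699 for the GTX 1080 Ti, $1,199 for the RTX 2080 Ti Founders Edition), not anything official.

```python
# Rough perf-per-dollar comparison. Prices are assumptions from this
# thread ($699 GTX 1080 Ti, $1,199 RTX 2080 Ti FE), not official quotes.
price_1080ti = 699.0
price_2080ti = 1199.0

# Speedup needed just to match the 1080 Ti on performance per dollar.
breakeven = price_2080ti / price_1080ti
print(f"Break-even speedup: {breakeven:.2f}x")  # ~1.72x

# Perf-per-dollar ratio at a few hypothetical speedups (1080 Ti = 1.0 baseline).
for speedup in (1.3, 1.5, 1.72, 2.0):
    ratio = (speedup / price_2080ti) / (1.0 / price_1080ti)
    print(f"{speedup:.2f}x faster -> {ratio:.2f}x the perf per dollar")
```

At those prices, anything under roughly 1.7x faster is actually a worse value per dollar than Pascal, which is why the "about twice as fast" bar seems reasonable to me.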
 
I'm willing to reserve judgement until the reviews come out. My first impression is that I'm not happy, but if these cards have insane performance to match their insane pricing, I say it's fine. By that I mean the RTX 2080 Ti needs to perform about twice as fast as a GTX 1080 Ti, with the RTX features turned off. Otherwise this pricing is too expensive IMO.

This!
 
I don't really see the issue with people pre-ordering. You already know this will be the single fastest card, if that is what you are after, and you can easily cancel the pre-order if the benchmarks don't align with your performance-to-dollars expectation. Based on previous Ti card availability at launch, I would say pre-ordering is the safe bet if you think you might want one.

I don't personally have high hopes for ray tracing in games in the short term, but I am probably in for a 2080 Ti for machine learning either way.

Yeah, my only concern is whether Nvidia is pulling a smart one here by holding the drivers back until September 20th. I have been searching, and there is absolutely no word on performance or even a date for reviews. The only way they keep such a tight grip is if they held the drivers back from leaking out. This just makes me think that outside of ray-traced games we might not see a huge performance increase. If reviews are done two weeks prior to release, that makes sense, so you can still cancel. It would be something if Nvidia only allows the reviews to go up after September 20th and doesn't hand out drivers until a week before. We might see some leaks, but by the time reviews are out, preorders will have likely shipped.
 
The card does 48 fps in Shadow of the Tomb Raider with RT on. I say it's time to hit that buy button hard!
 
The card does 48 fps in Shadow of the Tomb Raider with RT on. I say it's time to hit that buy button hard!

I read the 2080 Ti was doing less than 60 fps on the demo floor at 1080p, and the game is a month away. The frame rate drops were very bad and you could easily tell; it was falling well below 60.
 
Come on guys, it's only ONE mortgage payment. Doesn't everyone here have that "skip a mortgage payment" option from their bank? :p
 
Come on guys, it's only ONE mortgage payment. Doesn't everyone here have that "skip a mortgage payment" option from their bank? :p

Let me find out ROFL! I have never skipped that sucker in 7 years hahahaha. But even then I am not sure if I would buy this thing before I see some numbers. lol
 
I'm surprised that in this discussion of pricing nobody has mentioned the difference in die size between Pascal and Turing.

Turing has a die size of 754 mm^2 on the RTX 2080 Ti.
Pascal has a die size of 471 mm^2 on the GTX 1080 Ti.

That's a 60% increase in die size. As we all know, silicon is expensive, and there can be defects. With a bigger die, I think you will see more failures.

Therefore, lower usable yields. Of course, there will be some 'partial failures' that can still be turned into RTX 2070s with some sections disabled, but that has been common practice for years now.

But this could help explain why pricing has increased so sharply. A bigger die means fewer functional GPUs obtained per wafer manufactured.
No word yet on yield percentages, but maybe Nvidia doesn't want us to know their success/failure rate.

In short, maybe Nvidia is passing their manufacturing costs directly on to the consumer: bigger chip = more expensive to produce.
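For a rough sense of how much that could matter, here's a quick back-of-the-envelope sketch in Python. The 300 mm wafer size, the simple Poisson yield model, and the 0.1 defects/cm^2 defect density are illustrative assumptions only; nothing here comes from Nvidia or the foundry.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic gross dies-per-wafer approximation (ignores scribe lines, etc.)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Fraction of dies expected to be completely defect-free (Poisson model)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

# Die areas from above; wafer size and defect density are guesses for illustration.
for name, area_mm2 in [("GTX 1080 Ti", 471), ("RTX 2080 Ti", 754)]:
    gross = dies_per_wafer(area_mm2)
    good = gross * poisson_yield(area_mm2, defects_per_cm2=0.1)
    print(f"{name}: ~{gross} gross dies per wafer, ~{good:.0f} fully intact")
```

With those made-up numbers, the bigger die roughly halves the number of fully intact chips per 300 mm wafer (the rest have to be salvaged as cut-down parts or scrapped), which lines up with the cost argument above.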

My main concern with Turing, price aside, is power consumption, heat, and heat dissipation. I believe Nvidia finally went to a two-fan Founders Edition cooler design because... they had no other choice. The new GPU runs so hot with its billions of transistors and larger surface area that this was the only reasonable way for them to keep temps low enough to avoid thermal throttling.

This brings to mind the release of the GTX 480, which gave an amazing performance boost but ran extremely hot and was very power-hungry.
It ran so hot that people attempted to cook eggs on their 480s just to see if it would work!

I'm not saying this is the case for sure with Turing, but the larger die size and the sudden change in cooling solutions make me wonder just how HOT that spanking-new chip is going to run!
 
Seems to me that if you avoid showing benchmarks of modern games with these ray tracing-capable cards, and instead spend time discussing "needing a new way to measure performance," then the benchmarks for current games are going to be really underwhelming. And I didn't really love the demos they showed; it all felt a little artificial and less realistic.
 
I'm surprised that in this discussion of pricing nobody has mentioned the difference in die size between Pascal and Turing.

Turing has a die size of 754 mm^2 on the RTX 2080 Ti.
Pascal has a die size of 471 mm^2 on the GTX 1080 Ti.

That's a 60% increase in die size. As we all know, silicon is expensive, and there can be defects. With a bigger die, I think you will see more failures.

Therefore, lower usable yields. Of course, there will be some 'partial failures' that can still be turned into RTX 2070s with some sections disabled, but that has been common practice for years now.

But this could help explain why pricing has increased so sharply. A bigger die means fewer functional GPUs obtained per wafer manufactured.
No word yet on yield percentages, but maybe Nvidia doesn't want us to know their success/failure rate.

In short, maybe Nvidia is passing their manufacturing costs directly on to the consumer: bigger chip = more expensive to produce.

My main concern with Turing, price aside, is power consumption, heat, and heat dissipation. I believe Nvidia finally went to a two-fan Founders Edition cooler design because... they had no other choice. The new GPU runs so hot with its billions of transistors and larger surface area that this was the only reasonable way for them to keep temps low enough to avoid thermal throttling.

This brings to mind the release of the GTX 480, which gave an amazing performance boost but ran extremely hot and was very power-hungry.
It ran so hot that people attempted to cook eggs on their 480s just to see if it would work!


I'm not saying this is the case for sure with Turing, but the larger die size and the sudden change in cooling solutions make me wonder just how HOT that spanking-new chip is going to run!

I think the process has matured quite a bit. They will probably be power-limited, but yeah, cooling it at high clocks will be a challenge. I think that's why you see the 2080 Ti with lower clocks than the previous card. But yeah, that is a giant die for sure.
 
I feel like the pricing is both unfortunate and understandable. RTX seems to be a completely new breed of GPU. They're different enough that (and I know some of this is marketing bs) Nvidia felt the need to create a new performance metric for them. I see that, and I see the time gap between the last gen and today, and I think: "There was a LOT of R&D cost in this."

That said... it's unfortunate because I personally will never pay that much for a graphics card. Especially when I know that a lot of the card's tech isn't and won't be utilized by developers for a long time. Especially when my 1080 Ti drives my 165 Hz 1440p monitor just fine.

So... hard pass from me, and it's honestly the price that does it. I don't mind being an early adopter, but not at that premium. Looks like my CPU will be getting the love on this next upgrade cycle.
 