NVIDIA RTX 2080 and 2080 Ti Unclothed @ [H]

Real-time ray tracing isn't some proprietary tech like PhysX and Hairworks. It's built into the DX12 API now, as has been said ad nauseam elsewhere, and it's coming to Vulkan. Major game engines like Unity and Unreal Engine 4 also have it built in already. The roadblock up until now was that it was still too expensive to do on existing hardware. RTX is the jumping-off point for that. Adoption will be slow, true, but all new graphics technologies take time to reach ubiquity. Ray tracing is a game changer both in visual effects and actual development, so it is definitely no gimmick.
Well, no, but neither were PhysX and tessellation, and they aren't widely used to this day. Granted, ray tracing is a little different in terms of potential fidelity impact, but the point stands: by the time it is common enough to matter, the next gen will be here and have it polished.
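
On the DX12 point: DXR is exposed through a capability query, so an engine can ship a single code path and only light up ray tracing where the runtime and driver actually support it. A minimal sketch of the check, assuming a D3D12 device has already been created elsewhere:

    // Ask the D3D12 runtime whether DXR is available before using it.
    // Returns false on pre-DXR runtimes or unsupported hardware/drivers.
    #include <windows.h>
    #include <d3d12.h>

    bool SupportsRaytracing(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &opts5, sizeof(opts5))))
            return false; // runtime predates DXR
        // TIER_NOT_SUPPORTED means neither hardware nor a driver fallback.
        return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    }

If the check fails, the game just stays on its rasterized path, which is how the same title can span RTX and non-RTX hardware during the transition.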
 
Well, no, but neither were PhysX and tessellation, and they aren't widely used to this day. Granted, ray tracing is a little different in terms of potential fidelity impact, but the point stands: by the time it is common enough to matter, the next gen will be here and have it polished.

I agree it's not a gimmick and there is an early adopter risk, but if someone has the cash, go for it. Considering it's a third of the die, nVidia will be pushing the new tech [H]ard. I think there will be great improvements; even the Tomb Raider team said they are still working on polishing it. I might get it, and Tomb Raider, MechWarrior V, and Battlefield V are enough game time to justify it. They all support RT and Tensor core nonsense.

It's a cheap hobby. In 18-24 months, when 7nm launches, if it's actually an improvement you sell this for a few hundred and move on. Hell, my monitor was $1200 and my Vive Pro $1200; I don't mind spending $700 to power them (-$500 from selling my 1080 Ti).

Guess what - 5nm will be two years after that. ;)

We shouldn't be so negative. This is a great shift from the norm, and if early adopters want to take the brunt, cheer them on, IMO.
 
Real-time ray tracing isn't some proprietary tech like PhysX and Hairworks. It's built into the DX12 API now, as has been said ad nauseam elsewhere, and it's coming to Vulkan. Major game engines like Unity and Unreal Engine 4 also have it built in already. The roadblock up until now was that it was still too expensive to do on existing hardware. RTX is the jumping-off point for that. Adoption will be slow, true, but all new graphics technologies take time to reach ubiquity. Ray tracing is a game changer both in visual effects and actual development, so it is definitely no gimmick.

I agree with you that ray tracing is here to stay, unlike other things like PhysX (which also had great engine support, btw; UE3 was king in those days).

To me the problem is that games must be architected for the lowest common denominator - today that's the base model Xbox One. Next-gen consoles will have vastly improved CPUs which will enable simulations and AI that we just don't see right now. Unfortunately, next-gen consoles likely won't have any ray tracing capabilities, which leaves us with RTX features like better shadows or reflections being bolted on after the fact and having limited total impact on the experience.

Jen-Hsun said it in the presentation: today developers have to bake lighting into their artwork - with ray-tracing they don't have to. Unfortunately as long as games still have to run on platforms without ray-tracing the art will fundamentally be designed for rasterized lighting models. I don't see devs investing a lot of time in doing two sets of artwork.
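
To make the baked-vs-traced difference concrete, here's a toy sketch (names are hypothetical, not from any engine): baked lighting is a texture lookup frozen at build time, while a traced shadow query is answered fresh every frame, so lights and geometry are free to move.

    // Toy contrast between baked and traced direct lighting.
    struct Vec3 { float x, y, z; };

    // Baked: offline tools precompute lighting into a lightmap texel.
    // Cheap at runtime, but a moved light (or monster) won't update it.
    Vec3 shadeBaked(const Vec3& lightmapTexel) {
        return lightmapTexel;
    }

    // Traced: ask "can this point see the light?" every frame, per sample.
    Vec3 shadeTraced(const Vec3& point, const Vec3& lightPos, const Vec3& lightColor,
                     bool (*shadowRayOccluded)(const Vec3& from, const Vec3& to)) {
        if (shadowRayOccluded(point, lightPos))
            return Vec3{0.0f, 0.0f, 0.0f}; // blocked this frame: in shadow
        return lightColor;                 // visible: lit (BRDF omitted)
    }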

Another great example was the Metro scene, where they mentioned that with real lighting the devs could hide a monster up in the rafters. That would be super freaking awesome and a very Metro thing to do. But, again, the dev has to design the experience around the lowest common denominator, which makes this scenario unrealistic right now.

I'm really excited about the future of ray tracing, but I'd have much preferred better rasterization performance this generation and saved the ray tracing for the next generation or two, when we start seeing more diminishing returns on rasterization performance to the user experience.
 
I agree with you that ray tracing is here to stay, unlike other things like PhysX (which also had great engine support, btw; UE3 was king in those days).

To me the problem is that games must be architected for the lowest common denominator - today that's the base model Xbox One. Next-gen consoles will have vastly improved CPUs which will enable simulations and AI that we just don't see right now. Unfortunately, next-gen consoles likely won't have any ray tracing capabilities, which leaves us with RTX features like better shadows or reflections being bolted on after the fact and having limited total impact on the experience.

Jen-Hsun said it in the presentation: today developers have to bake lighting into their artwork - with ray-tracing they don't have to. Unfortunately as long as games still have to run on platforms without ray-tracing the art will fundamentally be designed for rasterized lighting models. I don't see devs investing a lot of time in doing two sets of artwork.

Another great example was the Metro scene, where they mentioned that with real lighting the devs could hide a monster up in the rafters. That would be super freaking awesome and a very Metro thing to do. But, again, the dev has to design the experience around the lowest common denominator, which makes this scenario unrealistic right now.

I'm really excited about the future of ray tracing, but I'd have much preferred better rasterization performance this generation and saved the ray tracing for the next generation or two, when we start seeing more diminishing returns on rasterization performance to the user experience.

The good news is that tech like this eventually finds its way into consoles. How long it takes to become affordable and efficient in that product segment is the real question, though. Consoles are just getting to the point where they can almost do 4K 60 FPS, and now they may have to contend with falling back to 1080p or worse for ray tracing. Considering consoles are inexorably tied to the television market, I fear they will stick with the former for quite a long time. But then again, streaming services may change that. So many questions and what-ifs...
 
The good news is that tech like this eventually finds its way into consoles. How long it takes to become affordable and efficient in that product segment is the real question, though. Consoles are just getting to the point where they can almost do 4K 60 FPS, and now they may have to contend with falling back to 1080p or worse for ray tracing. Considering consoles are inexorably tied to the television market, I fear they will stick with the former for quite a long time. But then again, streaming services may change that. So many questions and what-ifs...

Yeah, for sure. I think people today are much too focused on pixel count vs pixel quality. A 1080p game with true ray-traced lighting, AO, shadows, and reflections would make the rasterized version at 4K look silly... the contrast would be akin to Doom 2016 at 1080p vs Quake 1 at 4K.
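
A quick back-of-envelope supports the trade-off: taking NVIDIA's advertised ~10 gigarays/s for the 2080 Ti at face value (a generous assumption), the per-pixel ray budget at 1080p is about four times what it is at 4K:

    // Back-of-envelope ray budget per pixel per frame.
    // Assumes the ~10 gigarays/s marketing figure is actually sustained.
    #include <cstdio>

    int main() {
        const double raysPerSec = 10e9;            // NVIDIA's 2080 Ti claim
        const double fps        = 60.0;
        const double px1080p    = 1920.0 * 1080.0; // ~2.07 Mpix
        const double px4k       = 3840.0 * 2160.0; // ~8.29 Mpix
        printf("1080p60: %.0f rays/pixel/frame\n", raysPerSec / (px1080p * fps)); // ~80
        printf("4K60:    %.0f rays/pixel/frame\n", raysPerSec / (px4k * fps));    // ~20
        return 0;
    }

Even the ~80 rays per pixel best case is far short of offline path tracing, which is why the launch titles pair a handful of rays per pixel with denoising instead of tracing everything.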

Another thing I'm concerned about is AMD/console ability to perform ray tracing with the same APIs. It must be a tough balancing act for Nvidia to use their engineering superiority to enable these great technologies, but also make them ubiquitous/accessible enough that devs will want to implement them. It's paradoxical that for RTX to succeed, Nvidia needs AMD to be competitive using the same APIs.
 
I would be surprised if the next Xbox didn't support ray tracing... nVidia took 10 years to develop this tech. It's in DX12, and several popular engines already have support. I think it will come to games faster than prior new tech has been adopted. (I'm an optimist.)

Now if EA could get DX12 support working reliably in Battlefront II...
 
I would be surprised if the next Xbox didn't support ray tracing... nVidia took 10 years to develop this tech. It's in DX12, and several popular engines already have support. I think it will come to games faster than prior new tech has been adopted. (I'm an optimist.)

Now if EA could get DX12 support working reliably in Battlefront II...

It’s in BF5 and UE. I think it’ll be in a ton of games.
 
I did ask a contact over at NVIDIA about overclocking these cards and actually got a response.

Mileage will vary by chip... but the thermal headroom on the board and cooler is much larger than on past FEs. The clock headroom is also pretty nice.

It seems that NVIDIA is also rolling out a new overclocking tool of its own, referred to as "Scanner," that is a bit more "sciency." No NDA signed by us, so that is all I got. :) Do I dare say we see 1900MHz+ clock speeds with good cooling again?
 
I was able to get Boost overclocked to 1860 MHz after thermal throttling on my Titan X with the blower, so perhaps we could push up to and maybe past 2 GHz with the 2080 Ti?

>2 GHz was hard on the last gen? I did not know this. My 1080 Ti FE did a hair over 2000 when I ran benchmarks and shit. Granted, I set the blower to 100% and it was louder than a freight train, but it worked without issue. Which brings up an interesting question: is the new cooler quieter than the freight train from last gen?
 
No, we haven't been too focused on pixel count, really. Remember, most of us are high-end gamers here. 1440p gaming and ATTEMPTING to game at 4K are only just now going any kind of mainstream.

These resolutions have finally reached a point where a lot of us are saying "ok, I'm not impressed by more pixel density anymore."

So this does seem like the right time to finally force a revolutionary change in pixel quality. If we can really have ray tracing quality at playable speeds I want it. Period. Love the green team or hate them, if they can deliver on this tech with playable games... hell yes we will want it. I think the only reason we aren't all going even more ape over the demos is because the price has us bummed. The demos shown so far are just plain amazing.
 
Voted and commented.
That being said, I'm still not sold on these price points. $600 for a Founders Edition xx70-level card is kind of bonkers. By the time ray tracing is really a thing, the next-gen cards will be out, and buyers can better avoid early adopter's remorse.
As a 970 owner, I don't see myself moving to a 10xx series card, so I might wait three generations this time.

Non-FE cards will be cheaper, putting the 2070 at $499, so you'll get >1080 performance for the current 1080 MSRP. That's not bad. Not great, but certainly not bad.

However, we need the games/hardware to push them, and I'm looking squarely at VR. My 1080 runs everything (2D and VR) I throw at it with 4x AA right now, and I have little desire to go 4K since I'm all in on VR, and I saw the law of diminishing returns when I went from 720p to 1080p on my projector. So unless we see corresponding VR headsets that can use all that power, it's a massive pass for me. I have a feeling that by the time we get headsets that need the power, the next-gen 7nm parts will be close to launch.
 
People are complaining about the 2070 coming in at $500-600. What people should be thinking is that they are getting a card that outperforms a 1080 Ti for less, AND you get ray tracing acceleration of some type. NVidia is competing with... themselves! If you want to complain, AMD slacking off is a great place to start. Don't think ray tracing is all that? Well, textured bump mapping got the cold shoulder too. Matrox died, but every game uses it now.
 
People are complaining about the 2070 coming in at $500-600. What people should be thinking is that they are getting a card that outperforms a 1080 Ti for less, AND you get ray tracing acceleration of some type. NVidia is competing with... themselves! If you want to complain, AMD slacking off is a great place to start. Don't think ray tracing is all that? Well, textured bump mapping got the cold shoulder too. Matrox died, but every game uses it now.


You should google consumerism and study its basics. Also google monopolies.

People buying these cards because they have no other option and they want the best? OK. I'm ok with this.

People like you defending the price gouging for no other reason besides being a fan boy, not ok with this.
 
People are complaining about the 2070 coming in at $500-600. What people should be thinking is that they are getting a card that outperforms a 1080 Ti for less, AND you get ray tracing acceleration of some type. NVidia is competing with... themselves! If you want to complain, AMD slacking off is a great place to start. Don't think ray tracing is all that? Well, textured bump mapping got the cold shoulder too. Matrox died, but every game uses it now.

AMD slacking off is indeed a problem. Now, whether or not the 2070 outperforms a 1080 Ti is unsettled. We have nVidia's word on this, no more. If Kyle says it does, then that's great. But even then, you're seeing prices going up generationally. x70 hits x80 prices. x80 hits x80 Ti prices, etc... This means that while absolute performance is increasing, performance per dollar is not - at least, not as much. It's perfectly okay to think that kind of sucks, even if the hardware is drool-worthy.

Ray tracing is interesting. I have a feeling - reviews will confirm or deny - that it's not quite prime-time yet. First-gen features rarely are. But it will open the door. Kind of like how the original GeForce brought hardware T&L to the table, but it didn't become a big deal until a generation or two later. Still, someone has to open the door. Kudos to nVidia for doing that. And if/when prices come down, I'm interested. Early adopter tax is rough.
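
Rough numbers on the perf-per-dollar point (prices are from this thread plus the 1070's launch MSRP; the performance ratios are placeholders until reviews land, so treat this purely as illustration):

    // Illustrative perf/$ math. relativePerf values are HYPOTHETICAL
    // placeholders, not benchmark results.
    #include <cstdio>

    struct Card { const char* name; double priceUSD; double relativePerf; };

    int main() {
        const Card cards[] = {
            {"GTX 1070 (launch MSRP)", 379.0, 0.65}, // placeholder perf
            {"GTX 1080 Ti (street)",   650.0, 1.00}, // baseline
            {"RTX 2070 (street)",      600.0, 1.00}, // nVidia's claim, unverified
        };
        for (const Card& c : cards)
            printf("%-24s %.3f perf per $100\n",
                   c.name, c.relativePerf / c.priceUSD * 100.0);
        return 0;
    }

If the 2070 really does land at 1080 Ti performance, perf/$ nudges forward; if it only matches a 1080, it goes backward. That's the whole complaint in two numbers.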

I really like the DESIGN of this FE version, too. The simple, modern look of the card itself is a huge improvement over the shrouds of the 10xx generation. Nvidia is getting really good at designing these things - and not just from a performance aspect.
 
I am still skeptical about the actual performance. If they are so proud of these cards that they think they are worth these huge prices... then why are there no regular benchmarks? Even on games that favor them? I smell something fishy.
 
You should google consumerism and study its basics. Also google monopolies.

People buying these cards because they have no other option and they want the best? OK. I'm ok with this.

People like you defending the price gouging for no other reason besides being a fan boy, not ok with this.

Here's what. I'll read that, after you re-take basic mathematics.

Cheapest 1080 Ti = $650
Cheapest Vega 64 = $600
2070 = $600

Yup, I'm a fanboy for pointing that out.
 
Here's what. I'll read that, after you re-take basic mathematics.

Cheapest 1080 Ti = $650
Cheapest Vega 64 = $600
2070 = $600

Yup, I'm a fanboy for pointing that out.

You have no idea if the 2070 will beat the 1080 Ti. We don't know what the performance will be; not sure why you are defending its value.

The 1080 Ti is regularly $600 and below now.
 
You should google consumerism and study its basics. Also google monopolies.

People buying these cards because they have no other option and they want the best? OK. I'm ok with this.

People like you defending the price gouging for no other reason besides being a fan boy, not ok with this.

Even without monopolies, everyone in the industry wants bigger profits. Look at nvidia's Titan, and AMD coming up with their Frontier Edition. If anything, AMD also wants people to pay a premium on their cards, but they simply don't have the brand image nvidia has. Worse, we consumers will be hit with price fixing between companies. And both AMD and nvidia have already been caught price fixing before.
 
I am still skeptical about the actual performance. If they are so proud of these cards that they think they are worth these huge prices... then why are there no regular benchmarks? Even on games that favor them? I smell something fishy.

They're already going to be sold out for months. If they hype the 20 series up, more people will wait and won't buy the 10 series cards already produced. The old cards are already faster than AMD's, so they would only be hurting their own sales.

Aside from the new features, it's probably just another 20-25% performance increase, like it has been for the past several years.
 
They're already going to be sold out for months. If they hype the 20 series up, more people will wait and won't buy the 10 series cards already produced. The old cards are already faster than AMD's, so they would only be hurting their own sales.

Aside from the new features, it's probably just another 20-25% performance increase, like it has been for the past several years.

This is not a defense of not giving [H]ard performance numbers. They are massively overstocked on 10 series cards due to crypto's collapse. The massive overpricing of the 20 series may help with that 10 series stock problem. I bet these crazy prices are due to the losses from crypto and them wanting to keep making that level of money to at least temporarily satiate investors.
 
If the performance difference between the 2080 and the 2080 Ti isn't super absurd, I'd probably settle for the 2080. $1100 CAD vs $1600 CAD, based on the preorder pricing.
 
So no one really knows how it's going to perform, but everyone has already declared it's too expensive.

WTH has the old [H] techies' 'I got to have the baddest shit out no matter the cost' persona gone? Sounds like everyone is now in the 'give me cheap shit or I'm going to trash you' mindset, and no one has even tested it.

Amusing...
 
How long before we see dual 40mm rear exhaust fans on these huge multi-slot graphics cards?? These new 3+ slot cards are starting to get ridiculous, y'all. Just imagine:
[image: gpu2-sys2.jpg]
 
I am still skeptical about the actual performance. If they are so proud of these cards that they think they are worth these huge prices... then why are there no regular benchmarks? Even on games that favor them? I smell something fishy.

Something fishy, yes. But I think it is not performance related.
 
Looks like my 1080 Ti, which I bought on launch week, will be my best video card purchase. I've never had a video card this long, and I may keep it a while longer.
Still happy with it at 3440x1440, and that price on the 2080/2080 Ti is hard to justify.

If the performance difference between the 2080 and the 2080 Ti isn't super absurd, I'd probably settle for the 2080. $1100 CAD vs $1600 CAD, based on the preorder pricing.

I remember paying $950 CAD for my 1080 Ti at launch. Seemed pretty high at the time. Increase the price by ~70% and I'm not even interested anymore.
 
Looks like my 1080 Ti, which I bought on launch week, will be my best video card purchase. I've never had a video card this long, and I may keep it a while longer.
Still happy with it at 3440x1440, and that price on the 2080/2080 Ti is hard to justify.


I remember paying $950 CAD for my 1080 Ti at launch. Seemed pretty high at the time. Increase the price by ~70% and I'm not even interested anymore.

Just wait it out. The cards are still in stock here in Canada. Let them rot. Don't pay the $200 retailer price gouging fee. They are charging too much here in Canada. Only pay MSRP. Just give it some time. There are rumors that there is a lot of Pascal inventory that has to clear through the channel before we really start seeing the Turing cards in stock, and I assume at fair prices. Probably a few months would be my guess. I expect fire sales on the Pascal cards soon.
 
So no one really knows how it's going to perform, but everyone has already declared it's too expensive.

WTH has the old [H] techies' 'I got to have the baddest shit out no matter the cost' persona gone? Sounds like everyone is now in the 'give me cheap shit or I'm going to trash you' mindset, and no one has even tested it.

Amusing...

When a midtier card has been able to drive most games at max for years, there's not a lot of drive to upgrade. The only reason I'm considering upgrading cards is because VR is something that came along that is actually taxing my PC.
 
Well, no, but neither were PhysX and tessellation, and they aren't widely used to this day. Granted, ray tracing is a little different in terms of potential fidelity impact, but the point stands: by the time it is common enough to matter, the next gen will be here and have it polished.

I'm long overdue for a new video card; I'm rocking a 290 right now. I keep thinking: is the 2080 Ti really worth it? I don't have VR, I game at 1080p, and I might move up to 1440p sometime. Do I really want to spend the money on a 2080 when I can just get a 1080 and then skip the 20xx gen?

I guess I'll wait for benchmarks.
 
So no one really knows how it's going to perform, but everyone has already declared it's too expensive.

WTH has the old [H] techies' 'I got to have the baddest shit out no matter the cost' persona gone? Sounds like everyone is now in the 'give me cheap shit or I'm going to trash you' mindset, and no one has even tested it.

Amusing...


GPU prices have been inflating big time for years. Remember the days when a top of the line card cost $250? Yeah, those are gone. There's a price point at which even the most hard core enthusiasts will cry uncle. That being said, I'm not sure we're there yet. But the trend sucks.
 
So no one really knows how it's going to perform, but everyone has already declared it's too expensive.

WTH has the old [H] techies' 'I got to have the baddest shit out no matter the cost' persona gone? Sounds like everyone is now in the 'give me cheap shit or I'm going to trash you' mindset, and no one has even tested it.

Amusing...

Some of us are still here. RTX 2080 Ti FE on order.
 
To the people justifying 1080 Ti purchases... no need. It's a great card, blazing fast. The problem with Nvidia and new tech is, they introduce a new feature and then hammer devs to make exclusive use of that feature. This will start rendering the 1080 Ti cards slower. Each game and benchmark release will make more and more exclusive use of ray tracing / DLSS and the AI features (which are actually easier for devs vs non-RT), which the 1080 Ti won't be able to do properly. This is what's been done before and will be done again. Think hardware T&L, tessellation, HairWorks, PhysX, FSAA, etc. Nvidia really knows how to push devs to get stuff done, which in turn renders a previous gen slower.
 
To the people justifying 1080 Ti purchases... no need. It's a great card, blazing fast. The problem with Nvidia and new tech is, they introduce a new feature and then hammer devs to make exclusive use of that feature. This will start rendering the 1080 Ti cards slower. Each game and benchmark release will make more and more exclusive use of ray tracing / DLSS and the AI features (which are actually easier for devs vs non-RT), which the 1080 Ti won't be able to do properly. This is what's been done before and will be done again. Think hardware T&L, tessellation, HairWorks, PhysX, FSAA, etc. Nvidia really knows how to push devs to get stuff done, which in turn renders a previous gen slower.

I personally think hardware T&L is the perfect comparison, but for other reasons. I actually had a GeForce 256 DDR at launch (skipped the SDR version) and it was left in the dust quickly. Good luck to early adopters, though. I went with a used 1080 Ti.
 
Thanks for the breakdown. Exciting new cards, but I think that until ray tracing becomes more common, the emperor still has no clothes given the price points. Comparing the MSRPs of GPUs like the GTX 970 and 1070 against the 2070 shows what happens when competition wanes.

For those on the fence, I'd say they should for sure wait until real-world benchmarks come out. It'd be a tad foolish to buy it for ray tracing given it's first-gen hype tech at the moment.

If you plan to do some A.I.-based stuff alongside your gaming, it might be a good card to pick up, provided NVLink is fully operational. These seem to be a good "starting point" card for those wanting to tinker with A.I. programs who don't want to break the bank on an expensive Tesla/Quadro card.
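
On the A.I. angle: tensor cores are visible to CUDA software through the compute capability number (7.0 for Volta, 7.5 for Turing), so a quick runtime check tells you whether a given box has them. A minimal sketch using the standard CUDA runtime API, host-side C++:

    // List CUDA devices and flag tensor-core-capable ones (cc >= 7.0).
    // Build with nvcc; file/target names are up to you.
    #include <cuda_runtime.h>
    #include <cstdio>

    int main() {
        int n = 0;
        if (cudaGetDeviceCount(&n) != cudaSuccess || n == 0) {
            printf("no CUDA devices found\n");
            return 1;
        }
        for (int i = 0; i < n; ++i) {
            cudaDeviceProp p;
            cudaGetDeviceProperties(&p, i);
            // Tensor cores arrived with Volta (7.0); Turing reports 7.5.
            printf("%s: sm_%d%d%s\n", p.name, p.major, p.minor,
                   p.major >= 7 ? " (tensor cores)" : "");
        }
        return 0;
    }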
 
I personally think hardware T&L is the perfect comparison but for other reasons. I actually had a GeForce256 DDR at launch (skipped the SDR version) and it was left in the dust quickly. Good luck to early adopters though. I went with used 1080 TI.

I had a 256 SDR; it actually was quite fast with some serious overclocking when you fed the memory directly at 3.3V+. Just a little wire jumpered over the MOSFET to do it.
 
People like you defending the price gouging for no other reason besides being a fan boy, not ok with this.

It's not price gouging, it's pricing according to the supply/demand curves. Gouging would imply there was a freak increase in demand or decrease in supply (e.g. natural disaster) for an important/necessary good and the vendor is jacking up prices since the consumer has no choice but to buy the product anyway. You do not need the product to live. In most cases any card from the past 3 years is more than sufficient if money is an issue. You certainly are under no obligation to buy the product at what you feel is an extreme price. And if the cards sell out, that means they were priced too cheaply, as was the case during the mining boom.

I understand your frustration at pricing, but with people like you decrying corporations maximizing profits, not ok with this. These corporations employ thousands and spend millions if not billions on R&D which may or may not show a return. Look at Intel - they've shat the bed with their 10nm; companies aren't always successful. These are jobs and scientific advances vs. "can I buy a new gaming card without having to save more money". Your ire should be directed at AMD, Intel, etc. for lack of competent competition that could reduce prices.

Also remember there are parts of the product that are out of the producer's control wrt cost, the largest example being how the cost of memory is much, much higher today than it was two years ago when the 1080 generation released and they're now using a brand spanking new GDDR6 that is in high demand for more applications than your gaming. Blame AI. Blame autonomous vehicles. Blame the new mobile phone most people buy every 1-2 years. Blame all the cool stuff that's happening out there that I'm sure you're interested in but not connecting the dots with why your gaming card costs more money.

If you don't think the product is worth the money, don't buy it, but there are smart people whose job it is to figure out how much to charge for a product based on what people will actually pay for it and how much is required for its development and production to make financial sense for the company producing it. These cards will sell well, so it's up to you to decide if it's worth the money to you, because it certainly will be to many others.
 
Don't give a shit about any of this really.

Just waiting for Kyle to get the cards and pit them against the older 1070ti / 1080ti in the latest games at 1440p and 4k.

I want to see the real life numbers and how big of an improvement it really is. From the looks of it now, the 1080Ti, used for $500, is looking amazing.
 
Looking at the Metro video... why the hell do the bright spots move as the camera moves? You see this a lot in many video games: lighting changes as perspective changes. It is really immersion-breaking. I am not talking about god rays or shadows from the player, but the light shining through a window onto the floor. It moves as if the time of day is rapidly changing, but that does not change in other spots. Annoying.
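
Part of that is correct behavior, for what it's worth: specular highlights are view-dependent, so a bright spot on a glossy floor is supposed to slide as the camera moves. The view vector sits right in the shading math; here's a toy Blinn-Phong term to show where it enters:

    // Blinn-Phong specular: the half vector depends on the VIEW direction,
    // so moving the camera legitimately moves the highlight.
    #include <cmath>

    struct Vec3 { float x, y, z; };

    static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
    static Vec3 normalize(Vec3 v) {
        const float len = std::sqrt(dot(v, v));
        return {v.x / len, v.y / len, v.z / len};
    }

    float blinnPhongSpec(Vec3 normal, Vec3 toLight, Vec3 toCamera, float shininess) {
        const Vec3 h = normalize({toLight.x + toCamera.x,   // camera term
                                  toLight.y + toCamera.y,   // lives here
                                  toLight.z + toCamera.z});
        return std::pow(std::fmax(dot(normal, h), 0.0f), shininess);
    }

Diffuse light pooling on the floor shouldn't move, though; when it does, it's often a screen-space effect running out of data at the edges of the frame, which is exactly the class of artifact ray tracing is pitched to fix.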
 
I read an article on the RTX 2080 on Techradar where they quote the price of an RTX 2080 card as $699 (£749, AU$1,199). Since reading that article I have had emails from UK suppliers who have RTX 2080 cards listed for preorder at just over £1000 ($1287), and the Ti card is £1344 ($1730). Am I alone in thinking that that is a ridiculous amount of cash for these cards?
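
Those preorder listings work out to roughly a third over the quoted UK price; a quick sanity check using only the figures above:

    // Markup of the UK preorder price over Techradar's quoted figure.
    #include <cstdio>

    int main() {
        const double quotedGBP   = 749.0;   // Techradar's RTX 2080 price
        const double preorderGBP = 1000.0;  // UK supplier preorder
        printf("markup: %.0f%%\n", (preorderGBP / quotedGBP - 1.0) * 100.0); // ~34%
        return 0;
    }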
 
I read an article on the RTX 2080 on Techradar where they quote the price of an RTX 2080 card as $699 (£749, AU$1,199). Since reading that article I have had emails from UK suppliers who have RTX 2080 cards listed for preorder at just over £1000 ($1287), and the Ti card is £1344 ($1730). Am I alone in thinking that that is a ridiculous amount of cash for these cards?

Depends on whether or not they sell at those prices. I'm guessing they'll sell out initially at those prices, and perhaps the retailers know something about a low initial supply? If the cards sell, the price isn't ridiculous, and we've seen that consumers are indeed willing to shell out what was previously thought of as "ridiculous" to buy the latest and greatest GPUs. The trend will only reverse when consumers stop buying the cards at elevated prices.
 