NVIDIA RTX 2080 and 2080 Ti Unclothed @ [H]

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
55,596
NVIDIA RTX 2080 and 2080 Ti Unclothed

We get a look behind the svelte exterior of the new RTX 2080 and RTX 2080 Ti Founders Edition video cards. While the pricing has a lot of people turned off, the tech in these new cards is getting us a bit excited. NVIDIA has turned up its Founders Edition cards and even overclocked them. Will this push the AIBs even harder?

If you like our content, please support HardOCP on Patreon.
 
Thanks for the breakdown. Exciting new cards, but I think that until ray tracing becomes more common, the emperor still has no clothes given the price points. Comparing the MSRPs of GPUs like the GTX 970 and 1070 against the 2070 shows what happens when competition wanes.

Also given the treatment of [H], I'm hesitant to give team green any more money (after my GPU history of a 9800gt, GTX 260, 460, 660, 660ti boost, and now a 970, I'm a huge Nvidia fan). Truth matters, and punishing a reviewer for publishing the truth is wrong and hurts me as a consumer.

FPS vs. principles.... Hrrm....
 
Agree 100%. Have no trouble with my 1080 Ti's, even at 4k. Gonna sleep this one out until they get real on the pricing.
 
No [H]ardOCP review; no sale.
I don't think Kyle bought two copies of each card for his health, though no doubt both RTX 2080 Ti are going to go into his personal system afterward.
 
I'm sure they are great cards. They're asking just a little too much for them, though.

2070 at 1080 release price, 2080 at 1080 Ti release price, and the 2080 Ti at the Titan price point.

Looks like my Vega 64 and 1080 Ti are gonna keep on rocking for a while.
 
Any idea when you guys will get a review sample? Or are they going to wait til full launch?

[H] will (very likely) not get review samples (page 1: "While NVIDIA still will not talk to HardOCP because of us telling you the truth about GPP").

Kyle has written he pre-ordered 4 retail units, so assume those will be used for testing and reviews.

No [H]ardOCP review; no sale.

I don't think Kyle bought two copies of each card for his health, though no doubt both RTX 2080 Ti are going to go into his personal system afterward.

https://www.hardocp.com/news/2018/08/21/nvidia_offers_hardocp_2080_launch_access/
 
That whole presentation Nvidia did was a bit of a farce: tech, tech, tech, tech, tech, and virtually zero on performance, for good reason. As is to be expected, the first iteration of a card with ray tracing is barely able to run it at anything over 1080p. I suppose that will depend on a game-by-game basis, but it speaks volumes about why they kept banging on about the tech involved for over an hour and only made vague mentions of performance that didn't really say much at all. "6x previous generation"? Yeah, which previous generation? The Fermi 480? The 7800 GTX? Could they be more vague?

Seems like this card is more about the tech crammed into the core than raw performance, so I don't expect large gains over the 1080 Ti.
 

Did you miss the 4K demo? It was mentioned they had to lock the FPS to 60 due to the screen (they said it was able to run at 80 FPS). They said the current Titan XP was ~30 FPS (I may not have this number right) with the same settings.
 
Thanks for the breakdown. Exciting new cards, but I think that until ray tracing becomes more common, the emperor still has no clothes given the price points. Comparing the MSRPs of GPUs like the GTX 970 and 1070 against the 2070 shows what happens when competition wanes.

Also given the treatment of [H], I'm hesitant to give team green any more money (after my GPU history of a 9800gt, GTX 260, 460, 660, 660ti boost, and now a 970, I'm a huge Nvidia fan). Truth matters, and punishing a reviewer for publishing the truth is wrong and hurts me as a consumer.

FPS vs. principles.... Hrrm....

The impression given was that ray tracing is actually simpler to code than the current process, so adding it is actually a simplification for the developer vs. making things harder/more complex. I don't think this is going to be a gimmick unless Nvidia's drivers aren't up to the task.
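To illustrate the "simpler to code" claim: the heart of a ray tracer is a few lines of intersection math per ray, instead of rasterization workarounds like shadow maps or screen-space reflections. A minimal, illustrative sketch of a ray-sphere hit test and a shadow ray in Python (all function names here are hypothetical, not any real graphics API):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def ray_sphere_hit(origin, direction, center, radius):
    """Nearest positive hit distance t along the ray, or None.
    `direction` must be normalized."""
    oc = sub(origin, center)
    b = dot(oc, direction)              # half the linear coefficient
    c = dot(oc, oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None                     # ray misses the sphere
    t = -b - math.sqrt(disc)            # nearer intersection
    if t < 1e-6:
        t = -b + math.sqrt(disc)        # origin may be inside the sphere
    return t if t > 1e-6 else None

def in_shadow(point, light_pos, occluder_center, occluder_radius):
    """Cast a shadow ray from a surface point toward the light:
    the point is shadowed if the occluder is hit before the light."""
    to_light = sub(light_pos, point)
    dist_to_light = math.sqrt(dot(to_light, to_light))
    t = ray_sphere_hit(point, normalize(to_light),
                       occluder_center, occluder_radius)
    return t is not None and t < dist_to_light
```

The shadow query is one intersection test per light, which is the kind of directness posters here mean; the hard part in practice is doing billions of these per second, which is what the RT cores are for.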
 
I think the determining factor for me on whether I buy the 2080TI or not will be two fold. The first, like most people here, will be the [H] review. The second will be how my 1080 runs Cyberpunk 2077 and/or the benefits the 2080 bring to that game.
 
The impression given was that ray tracing is actually simpler to code than the current process, so adding it is actually a simplification for the developer vs. making things harder/more complex. I don't think this is going to be a gimmick unless Nvidia's drivers aren't up to the task.
Simpler from an absolute standpoint, maybe. But devs are used to the current workarounds, and it still requires implementation, first into engines and then into games. And that takes time. Any game currently under development that hasn't already included the tech probably won't, which means not many games will have it for a while. I'm thinking it may be similar to PhysX, the various hair technologies, etc., where it is a selling point for some AAA games but takes a long time to be regularly used.
 
Did you miss the 4k demo?? It was mentioned they had to lock the FPS to 60 due to the screen (they said it was able to run at 80 FPS).. They said the current Titan XP was ~30 FPS (I may not have this number right) with same settings.


Yeah, it's a tech demo, and it hitched to 47 fps at one point. And obviously their big talking point of RT wasn't enabled in it either. They also cut it short, as it runs for over 3 minutes, so it's possible there were instances where it didn't maintain 60 fps for a period of time. I'm going to download it and see how it runs. Regardless, it wasn't a game, just a tech demo.
 
Simpler from an absolute standpoint, maybe. But devs are used to the current workarounds, and it still requires implementation, first into engines and then into games. And that takes time. Any game currently under development that hasn't already included the tech probably won't, which means not many games will have it for a while. I'm thinking it may be similar to PhysX, the various hair technologies, etc., where it is a selling point for some AAA games but takes a long time to be regularly used.

That's true. Nvidia also has the bandwidth and money to partner with game engine suppliers to make sure it's there for new titles, but yeah, existing ones may be SOL. I suppose the silver lining here is that the 2080 Ti appears to be a massive performance increase vs. the existing 1080 Ti and Titan XP. One other big item I see not being discussed is the role AI is going to play in the rendering pipeline this gen. That could be a very huge deal too. Time will tell for sure... If it's a bust, at least people can still buy Pascal-gen cards.
 
I really just want to see real game performance. All we were told is how many RTX OPS the cards can do compared to the previous gen. There are two possibilities here: 1) the cards aren't that much faster than Pascal unless you have all the RTX stuff turned on, or 2) all the extra AI cores and whatnot actually provide a benefit outside of RTX situations and we get a pretty good performance bump. I'm leaning towards option 1 due to the lack of any mention of option 2 in their conference, but I'm patient enough to wait and see.
 
Kind of ironic seeing people complaining about an entirely new path towards better graphics.

I don't like the prices, I get it, but I also understand that a decade to develop a technology isn't cheap. Either you want someone to push the envelope, or you don't.

If you do, then that takes massive investment and time. Don't be surprised if the company tries to recoup that investment and make as large a profit as possible.
 
I'm extremely skeptical about the whole NVLink memory-pooling thing and the likelihood of increased multi-GPU support here, since there hasn't really been a peep about it from Nvidia marketing. Seems like it would be a really easy way to rack up 5,000 or 10,000 extra GPU sales from people intrigued to give SLI a go again, but nary a word about it.

If I were to place a bet, they streamlined things to a single multi-GPU bridge port instead of two, adopting the NVLink connector on consumer cards, but that's as far as it goes, with the memory pooling and such being locked down at the driver level.

Edit:

TL;DR - Probably just the same as the old SLI bridge but using NVLink connectors, unless you wanna shell out Quadro cash.
 
Yeah, it's a tech demo, and it hitched to 47 fps at one point. And obviously their big talking point of RT wasn't enabled in it either. They also cut it short, as it runs for over 3 minutes, so it's possible there were instances where it didn't maintain 60 fps for a period of time. I'm going to download it and see how it runs. Regardless, it wasn't a game, just a tech demo.

So... the tech demo that was locked to 60 FPS but does 80 FPS at 4K, when a current-gen 1080 Ti does the same benchmark at ~35 FPS with the same settings, is not a performance measure because... it's a benchmark and not a game? I am not trying to poke fun at you here, just trying to follow the point you are trying to make...

There were ray-tracing performance measurements compared to current-gen 1080 Tis as well, which put the 2080 Ti at over 10x the performance.

I do see your point about the general lack of discussion of FPS in games, and that could mean one of three things: they thought the tech behind what we were seeing was more impactful than the FPS and didn't want insane FPS to steal the show from all the work they did to make it happen; the FPS in current games isn't a whole lot better vs. the current gen; or the new games demoed with the new tech are simply not "ready" to give a fair representation of actual gameplay, so they left it out as it's simply not relevant to what you will see when the games and products are released.

The bottom line is we will all know for sure in less than a month, which is not too bad...
 
No HDMI 2.1... how disappointing, for those of us wanting to break away from brand-specific variable refresh rate and the like. The negatives for me just keep piling up.
 
No HDMI 2.1... how disappointing, for those of us wanting to break away from brand-specific variable refresh rate and the like. The negatives for me just keep piling up.

Did you honestly expect Nvidia to provide people a path away from their Gsync tech? That would be terrible for their business.
 
If the performance justifies the price then I “would” buy.

But I am still happy at 1440/144hz and a 1080ti.
 
The 1080 Ti is going to end up the real winner here... people were waiting for the 2000 series but will now fall back on the 1080 Ti.

Just bought a used EVGA GTX 1080 SC for $300 on eBay. Been watching GTX 1060 6GB, GTX 1070, GTX 1070 Ti, and GTX 1080 prices on eBay - all looking pretty damn good, especially the 1060 6GB if you're willing to "settle".

Granted - still have a 2080 Ti on pre-order... :)
 
Did you honestly expect Nvidia to provide people a path away from their Gsync tech? That would be terrible for their business.
I expect them to support industry standards, yes. I also expect them to up the ante, be competitive, and be pro-consumer, or they can get bent. If they want G-Sync to still be a thing, then improve it and make it worth it. I vote with my wallet, and these things matter more to me than a few extra FPS in a game.

Edit: I'll add that the monitors I'm most interested in, a thing that makes a greater impact on my enjoyment of more things than just the game, including the game, have never come in a G-Sync version... SO YEAH.
 
The games don’t lie. Anand had their hands-on and said that 1080p with RTX features in BFV had frame drops. Not good performance. Anyone who has been tinkering with graphics cards should know that the first card to feature brand new tech usually sucks at it. Anti-aliasing took years to perfect. 4K? We are just now getting it on a single card, really. Ray tracing, okay, this card does it, but if frames in BFV are gonna tank because of some reflections, why even consider this card at that price point?

GTX1080ti owners seem like the winners here—ride it out.

If BFV turns into some kind of masterpiece then maybe I’d jump into Ray tracing when they can get it at 60fps minimum. But I will be damned if I spend nearly twice the money of a GTX1080ti so I can tank my frame rates with some fancy shadows and reflections.

The pricing truly is out of whack for something that is obsolete out of the gate. I never really considered most PC gamers to be overflowing with cash.

Some graphics cards make a compelling case but this ain’t one of them.
 


Agreed. Sort of like tessellation on the HD 5800/GTX 400 series: if you bought for that feature you were bound to be disappointed, but by the next gen both companies had it ironed out.
 
Anyone remember the 8800 Ultra launch, with unified shaders (now used by every GPU)? Or the good old GeForce 256 (hello there, hardware transform & lighting)?

I'm more surprised so many engines have ray-tracing support in place already (though Unity appears to be noticeably absent). Then again, Nvidia knows how to launch a new concept while avoiding a catch-22, as we can see with the success of CUDA.
 
Simpler from an absolute standpoint, maybe. But devs are used to the current workarounds, and it still requires implementation, first into engines and then into games. And that takes time. Any game currently under development that hasn't already included the tech probably won't, which means not many games will have it for a while. I'm thinking it may be similar to PhysX, the various hair technologies, etc., where it is a selling point for some AAA games but takes a long time to be regularly used.
Real-time ray tracing isn't some proprietary tech like PhysX and HairWorks. It's built into the DX12 API now, as has been said ad nauseam elsewhere, and it's coming to Vulkan. Major game engines like Unity and Unreal Engine 4 also have it built in already. The bump in the road up until now was that it was still too expensive with existing hardware; RTX is the jumping-off point for that. It'll be slow to be adopted, true, but all new graphics technologies take time to reach ubiquity. Ray tracing is a game changer in both visual effects and actual development, so it is definitely no gimmick.
 
The games don’t lie. Anand had their hands on and they said 1080p with RTX features in BFV had frame drops. Not good performance. Anyone who has been tinkering with graphics cards should know that the first card to feature brand new tech usually sucks at it. Anti-aliasing took years to perfect. 4K? We are just now getting it on a single card, really. Ray tracing, okay, this card does it, but if frames in BFV are gonna tank because of some reflections, why even consider this card at that price point.

....

Some graphics cards make a compelling case but this ain’t one of them.

The better answer is to 'wait for the [H] review'. This is all speculation at this point.
 