RTX 2070 or GTX 1080 Ti or Navi For Gaming?

Sithtiger

So, I've been waiting for the new Nvidia GPUs to drop, specifically the successor to the GTX 1070 (which is what I have), but I'm hearing a lot of rumors that the RTX 2070 Turing (along with the rest of the RTX line) runs slower in terms of FPS than Pascal.

OK, so ray tracing will be great one day, but right now, not so much. What I'm most concerned with is FPS, CUDA cores, and Tensor cores. The Titan V destroys games but has no RT hardware. I do want RT, but not at the expense of framerate. So what I'm asking is: will the RTX 2070 be able to outperform my GTX 1070? If they've cannibalized the RTX 20xx line just for the ray tracing implementation, I'll be furious! If RT is going to limit my framerate to 60 FPS, or worse, drop it lower as if it were 5 years ago, then I don't want it!

I would think that Nvidia and game devs would include an RT on/off switch to allow a higher framerate if you don't want it. And I don't want it if they've gutted the 2070 just to make RT available. Right now I care more about FPS than RT cores, at least until games look like the Star Wars demo, and even then I want at least 100+ FPS. Should I just get the GTX 1080 Ti, or will the RTX 2070 be as fast or faster in a direct comparison? Does anyone even know what an RTX 2070/2080 would get in current games in terms of framerate?

OK, so I just looked over the specs, and the 2070 has 2,304 CUDA cores vs. 1,920 for the 1070, so the 2070 should easily be faster. Now we just need the ability to disable RT if we so choose, because I'm NOT playing BFV at 30 FPS. RT is cool, but not at the expense of framerate, especially at 30 FPS. I've heard others say it was 60 FPS at the event, but weren't they using an RTX 2080 Ti? Anyway, what do you guys think?
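To put a quick number on that core-count difference (a back-of-envelope sketch only; raw CUDA core count ignores clocks, IPC, and memory changes, so it's a rough proxy at best):

```python
# Back-of-envelope: raw CUDA core counts from the public spec sheets.
# Core count alone says nothing about clocks or architecture, so treat
# this as a rough proxy, not a performance prediction.
gtx_1070_cores = 1920
rtx_2070_cores = 2304

uplift_pct = (rtx_2070_cores / gtx_1070_cores - 1) * 100
print(f"RTX 2070 has {uplift_pct:.0f}% more CUDA cores than the GTX 1070")
# -> RTX 2070 has 20% more CUDA cores than the GTX 1070
```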

Oh, and is Intel releasing any new CPUs besides the low-powered ones? I'm looking for performance, not power saving.
 
Waiting is really your best bet. You can then decide based upon the actual performance reviews. Be prepared, though: if the 2070 is a beast, then you will need to be quick on the trigger to secure an order. If it's not, then it's not, and you can either sit on that 1070 for another gen or trade it up to a faster 1080/Ti.

To your question about whether Turing is slower than Pascal, I doubt that's the case... and if RT doesn't come with a software disable switch, there will be a mass uprising... ;)
 
No one really knows until we get some benchmarks/reviews from reputable sources. Just wait until then and the decision should be much simpler for you.
Yeah, that's what I'm going to do. If it isn't faster in CUDA cores and the rest, that would be a MASSIVE mistake, and I can't believe Nvidia would do that. It would be the equivalent of recommending people buy AMD GPUs, because until RT becomes standard, Nvidia would be slower in most games, all to FORCE the industry into putting RT into every game.

RT does indeed seem to be the future, but I didn't like Nvidia's "not so subtle" way of trying to get us to buy into it. Honestly, these first-gen games with RT in them will suck. It would be like trying to run Battlefield V on a GTX 280 or something and getting only 30 FPS. That would be horrible. I can't believe Nvidia would be that stupid. I know Nvidia's great at strongarming vendors and devs, but they tried SO hard to get people excited about RT. Yes, RT "WILL BE" amazing, but right now I'd rather run games at 1440p getting 150 FPS than at 30 FPS with RT on! God, 1080 Ti's are so cheap right now (compared to what they were). I'm just a little worried that if the 2070 turns out to be what some people believe it to be and I wait too long, the 1080 Ti's will be gone. I don't think that's the case, though, because the specs seem to indicate that it should run faster. I don't like how they were comparing Turing to Pascal using RT as the baseline. That's like comparing Nvidia's first card with hardware shaders to an older card designed without shaders in mind.

Waiting is really your best bet. You can then decide based upon the actual performance reviews. Be prepared, though: if the 2070 is a beast, then you will need to be quick on the trigger to secure an order. If it's not, then it's not, and you can either sit on that 1070 for another gen or trade it up to a faster 1080/Ti.

Yeah, gonna wait for sure. As I said in the reply above, Nvidia only compared Turing to Pascal in ray tracing performance. That's unfair AND a little suspect, since they didn't compare Turing and Pascal in shader performance or rasterization, etc.

To your question about whether Turing is slower than Pascal, I doubt that's the case... and if RT doesn't come with a software disable switch, there will be a mass uprising... ;)
Yes, there will be. If that's true, then suddenly AMD's GPUs would be faster and the 1080 Ti would be the card to get. By the time RT is a normal thing, even the RTX 2080 Ti will be laughably slow compared to, say, the RTX 4080 Ti.
 

Also, another downside: apparently the 2070s won't ship until the November time frame.
 
The 2070/2080/2080 Ti will all be faster than their Pascal counterparts in traditional games (they all have 15-21% more CUDA cores). If a game implements DLSS, they'll be way faster.

Also probably other features we don’t know about yet that make them faster.

The reviews drop in a week or two. Just wait and see the delta between the 2080 and 1080; the 2070 vs. 1070 delta should be about the same.
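For what it's worth, that 15-21% figure checks out against the published core counts (a quick sketch only; again, raw core count ignores clocks and architectural changes):

```python
# Checking the "15-21% more CUDA cores" claim against the published counts.
pairs = {
    "RTX 2070 vs GTX 1070":       (2304, 1920),
    "RTX 2080 vs GTX 1080":       (2944, 2560),
    "RTX 2080 Ti vs GTX 1080 Ti": (4352, 3584),
}

for matchup, (turing_cores, pascal_cores) in pairs.items():
    delta_pct = (turing_cores / pascal_cores - 1) * 100
    print(f"{matchup}: {delta_pct:.0f}% more CUDA cores")
# RTX 2070 vs GTX 1070:       20% more CUDA cores
# RTX 2080 vs GTX 1080:       15% more CUDA cores
# RTX 2080 Ti vs GTX 1080 Ti: 21% more CUDA cores
```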
 
Also, another downside: apparently the 2070s won't ship until the November time frame.

Yeah, I noticed that. I'm not real happy about it, but Nvidia's PR guy Tom Petersen stated on HotHardware's Geeks webcast that the Turing lineup, in general, will perform well compared to Pascal, and that gamers should expect roughly 35-45% better performance going from Pascal to the same-tier Turing graphics card. I'm going to trust what he says, because either way I'll know when the reviews come out, which will be before the cards do. If for some reason Nvidia is lying and the RTX 2070 is crap with its CUDA cores, I won't get it; I'll get the GTX 1080 Ti instead. I doubt that, though. Even though the Tensor cores need optimizations and drivers, we already know Tensor cores are amazing. The Titan V just decimated the GTX 1080 Ti, even with the Tensor cores disabled; when they were enabled, it was devastatingly fast. I know the 2070 doesn't have nearly the same number of Tensor cores, but the tests will confirm everything.

This is all REALLY frustrating because I only have $1,200 to spend, and this next upgrade is important. If I had the money, I'd get the RTX 2080 Ti and the i9-9900K, but there's no way. I can get the RTX 2070, though... assuming it's worthy. I can also get the 9600K, but I'm pretty sure I'd have to wait for it, and I don't think it's worth waiting for; the i5s are practically identical between the two generations.
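Just to sanity-check the budget (every price below is my rough assumption for late-2018 street pricing, not a quote from any retailer, so treat the exact figures as placeholders):

```python
# Rough budget check against the $1,200 ceiling. All prices are assumed,
# approximate street prices, not actual listings.
BUDGET = 1200

prices = {
    "RTX 2080 Ti": 1200,  # assumed
    "GTX 1080 Ti": 650,   # assumed
    "RTX 2070": 550,      # assumed
    "i9-9900K": 490,      # assumed
    "i5-9600K": 280,      # assumed
}

combos = [
    ("RTX 2080 Ti", "i9-9900K"),  # the dream build
    ("GTX 1080 Ti", "i5-9600K"),  # the fallback
    ("RTX 2070", "i5-9600K"),     # the likely pick
]

for gpu, cpu in combos:
    total = prices[gpu] + prices[cpu]
    verdict = "fits" if total <= BUDGET else "over budget"
    print(f"{gpu} + {cpu}: ${total} ({verdict})")
```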
 
The 2070/2080/2080 Ti will all be faster than their Pascal counterparts in traditional games (they all have 15-21% more CUDA cores). If a game implements DLSS, they'll be way faster.

Also probably other features we don’t know about yet that make them faster.

The reviews drop in a week or two. Just wait and see the delta between the 2080 and 1080; the 2070 vs. 1070 delta should be about the same.

Yeah, I believe they'll be faster too. Either way, the GPU will be the easiest part of the decision. I WILL wait for the 2070 if it's not available yet. I can't afford to wait for the i5-9600K unless I see that I have enough money and/or it's vastly superior to the 8600K, which I seriously doubt. I'm betting the 8600K, maybe even the 8700K, will go on a nice sale, which would make the 8th gen worth getting!
 
If I have to speculate, I would say that the GeForce RTX 2080 is barely faster than the GeForce GTX 1080 Ti and that the GeForce RTX 2070 is way slower.
 

Yeah, the 2070 is going to be around the 1080, and the 2080 is going to be around the 1080 Ti. I think the performance gain is mostly going to come from higher boost clocks, more bandwidth from GDDR6, and up to 21% more CUDA cores. Basically, a higher price for less of a generational jump than we're used to. Nvidia is basically selling on DLSS and ray tracing. Ray tracing is already a no-go for me for the next few generations; early adopters will shell out the cash. I'm not sure I want to spend $1,200 on a card to play at 1080p for some nicer lights and shadows. I wonder if the 2070 and 2080 will even be able to handle ray tracing, when the 2080 Ti barely gets 1080p going. Maybe they will, since it really comes down to the number of RT cores. I think we are looking at two or three generations before we truly get ray tracing at high FPS.
 
For anyone who's still following this thread, I've got an update. I was able to make some more money and upgrade my whole system, although I have to wait on the GPU for a couple of months. I think this will actually be a good thing in the long run, since the 2080s and especially the 2080 Tis have had an overheating problem with their GDDR6 memory. Here's what I bought and can use as soon as it comes in tomorrow: an i5-9600K, an ASRock Z390 Phantom Gaming SLI/ac, and G.SKILL Ripjaws V Series 16GB DDR4-3200 (PC4 25600) RAM, model F4-3200C16D-16GVGB.

As I said, I have to wait a couple of months to buy an RTX 2080. I'm going to get the MSI GeForce RTX 2080 GAMING X TRIO, with a core clock of 1515 MHz and a boost clock of 1860 MHz. It currently costs $849.00. Aside from the price, which I can manage if I save for a couple of months, the size of this card is a concern. I have a CM HAF 912, and the card is 327 mm (12.87 inches) long. I know I'll have to remove one of my drive cages, but I still wonder if it will fit. According to PCPartPicker it will, but can anyone verify that? Also, this card has a perfect 5 out of 5 eggs at Newegg. That will probably change, but it's the highest-rated RTX 2080 at Newegg right now.
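Here's how the numbers stack up, by the way. The card length is from MSI's specs, but the two case clearance figures below are my assumptions to double-check against Cooler Master's spec sheet for the HAF 912; I'm only illustrating the comparison, not vouching for the exact numbers:

```python
# Fit check for the 327 mm MSI card in a CM HAF 912. The clearance figures
# are assumptions -- verify them on the case's spec sheet before buying.
CARD_LENGTH_MM = 327             # MSI RTX 2080 GAMING X TRIO, per MSI's specs

CLEARANCE_WITH_CAGE_MM = 287     # assumed -- check the HAF 912 spec sheet
CLEARANCE_WITHOUT_CAGE_MM = 400  # assumed -- check the HAF 912 spec sheet

def fit_report(clearance_mm: int) -> str:
    """Compare available clearance to the card length and report the margin."""
    margin = clearance_mm - CARD_LENGTH_MM
    if margin >= 0:
        return f"fits with {margin} mm to spare"
    return f"too long by {-margin} mm"

print("With drive cage installed:", fit_report(CLEARANCE_WITH_CAGE_MM))
print("With drive cage removed:  ", fit_report(CLEARANCE_WITHOUT_CAGE_MM))
```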

The other choices currently are the EVGA GeForce RTX 2080 BLACK EDITION GAMING for $760.00. It's only 269 mm (10.60 inches), the cheapest of the bunch, and it has the lowest boost clock at 1710 MHz. There are two other EVGA RTX 2080 cards with the same design; the only difference is that one has a boost clock of 1800 MHz for $800 and the other has a boost clock of 1815 MHz for $850.00. I'll take one of those four cards. I actually would like
 
I don't understand people buying 6c/6t CPUs in late 2018... the reasoning is beyond me; it just ends with bottlenecked GPUs.
 
The 9600K beats Ryzen and even Intel's older 8-core in gaming.

[attached benchmark chart]
 

I think the real detail is in the system configuration. We know the 9600K is likely boosting 700-800 MHz higher than the Ryzen on all cores, and I'm not sure what RAM settings they're using on the Ryzen, which makes a big difference. In either case, the new 9000 series is going to be better, since it runs much higher clocks than any other processor and boosts almost all cores automatically.

I do think AMD will get much closer to Intel in gaming with Zen 2, since they should get into the high-4 GHz range on 7nm.
 
Yes, the i5 K CPUs have always been good for gaming. Heck, even the i3s can beat the 2700X at 1080p in some games.
 
Some nice info in this video about Navi GPUs. The new info is in line with what we already knew from rumours months ago: small Navi is coming in Q2 2019, taking Polaris's place in price and Vega's place in performance, and big Navi will battle Nvidia's high end in late 2019.

 
Late 2019? Wow, Nvidia's next cards will already be right around the corner again by then. This thing had better be amazing.
 
If they price it right, I think they will sell well. That shouldn't be too hard to do, given the RTX cards' current pricing.
 