Should I wait till summer for the GTX 1180 or get a 2070 or 2080 now ?

Subzerok11 · Gawd · Joined: Aug 13, 2014 · Messages: 550
See sig for PC specs. I currently have an RTX 2060, but I want a little more fps, so I'm returning it to the store this week. Should I get a 2070 for 10-15% more fps at about $200 more? Or get the 2080 and spend $350 more for a 30-40% increase in fps? The thing I'm worried about is the resale value of these RTX cards. Rumor has it that a GTX 1180 is going to drop this summer, and without ray tracing it might be only $500-550; if ray tracing fails, the resale value of my 2080 won't be very good on eBay. $700+ for a card is so damn much, I kinda feel guilty just thinking about spending that on a GPU. I won't even be using ray tracing most of the time because I'd rather get as high fps as possible, so I hate to spend on a feature I won't use. Maybe I should do without a GPU for a while till I see how things shake out, and just use the integrated GPU in my CPU for now?
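To put rough numbers on the tradeoff above, here's a quick cost-per-performance sketch using only the extra prices and fps-uplift ranges quoted in this post (midpoints assumed; these aren't benchmark figures):

```python
# Rough cost-per-performance comparison, using the extra cost and
# fps-uplift midpoints quoted above (assumptions, not benchmarks).
options = {
    # name: (extra cost in USD over the RTX 2060, fps uplift fraction)
    "RTX 2070": (200, 0.125),  # midpoint of 10-15%
    "RTX 2080": (350, 0.35),   # midpoint of 30-40%
}

for name, (extra_cost, uplift) in options.items():
    cost_per_point = extra_cost / (uplift * 100)
    print(f"{name}: ${cost_per_point:.2f} per extra 1% of fps")
```

By this crude measure the 2080 is actually the better value per point of fps ($10 vs $16 per 1%), even though its sticker shock is bigger.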
 
There will be a 7nm shrink in Nvidia's future. When that is really depends on Samsung. It could be out later this year, depending on how quickly they can ramp up large dies.

Raytracing is here to stay. Raster graphics have hit a wall, and RT is the only way left to go. But feel free to wait until the second gen before you upgrade.

The RTX 2060 should be able to handle 1440p at mid/high for the rest of the year.
 
Different classes of cards. If you don't need the RTX features, I'd probably wait for the 11xx ones.
 
See sig for PC specs. I currently have an RTX 2060, but I want a little more fps, so I'm returning it to the store this week. Should I get a 2070 for 10-15% more fps at about $200 more? Or get the 2080 and spend $350 more for a 30-40% increase in fps? The thing I'm worried about is the resale value of these RTX cards. Rumor has it that a GTX 1180 is going to drop this summer, and without ray tracing it might be only $500-550; if ray tracing fails, the resale value of my 2080 won't be very good on eBay. $700+ for a card is so damn much, I kinda feel guilty just thinking about spending that on a GPU. I won't even be using ray tracing most of the time because I'd rather get as high fps as possible, so I hate to spend on a feature I won't use. Maybe I should do without a GPU for a while till I see how things shake out, and just use the integrated GPU in my CPU for now?

That's a nonsense rumor. There won't be a GTX 1180.

The only reason there will be GTX 1660 and below is because RTX HW would require too large a percentage of die space for this class of card.
 
That's a nonsense rumor. There won't be a GTX 1180.

The only reason there will be GTX 1660 and below is because RTX HW would require too large a percentage of die space for this class of card.

Totally agree with Snowdog here. There WON'T be any 1100-series anything cards; all we have to look forward to is the 2000- and 1600-series bullshit from nVidia.
They've stopped producing the 1000-series cards, so why would they want to nerf their own product into the ground? They still need to make money on what they're making now (or on what TSMC is making for them now).
 
That's a nonsense rumor. There won't be a GTX 1180.

The only reason there will be GTX 1660 and below is because RTX HW would require too large a percentage of die space for this class of card.


Never say never, I get a small feeling it will happen.
 
Totally agree with Snowdog here. There WON'T be any 1100-series anything cards; all we have to look forward to is the 2000- and 1600-series bullshit from nVidia.
They've stopped producing the 1000-series cards, so why would they want to nerf their own product into the ground? They still need to make money on what they're making now (or on what TSMC is making for them now).


"There WON'T be any 1100-series anything cards"

That's your opinion; you never know, we will see this summer. The 1600 series is bullshit? How is the 1600 series bullshit? It's the true successor to the 1060; it's the mainstream card 1080p users have been waiting for, basically a 1070 for 1060 money. You do know that 1080p is still by far the most popular resolution, right?
 
I'm looking at my crystal pyramid...


It said you shouldn't wait.
 
I'll take logical reasoning over your "feelings" every time.
Here is some logical reasoning. After five months how many games have RTX? One. Not even the Tomb Raider RTX patch, promised long ago, is out.

Raytracing will only be here to stay when the consoles get it.
 
That's a nonsense rumor. There won't be a GTX 1180.

The only reason there will be GTX 1660 and below is because RTX HW would require too large a percentage of die space for this class of card.
Not only that, but such a card would eat into RTX sales pretty badly, something Nvidia would not likely do.
 
Here is some logical reasoning. After five months how many games have RTX? One. Not even the Tomb Raider RTX patch, promised long ago, is out.

So after 10 years of planning and work, they should just give up when RT doesn't set the market on fire after 5 months? This is how losers think.

An x80 card without RT would mean NVidia giving up on RT after only a few months, which doesn't make any sense.

This transition was always going to be a rough one, with a lot of silicon devoted to features that don't have much support, and a chicken-and-egg problem getting software support. But first-gen RTX is really the setup for the next gen: the 7nm cards that come out 18-24 months later with even better RT HW and, crucially, some actual software when they arrive, after being seeded by this rocky generation.
 
Anything above 1660 Ti-level performance would cut into RTX sales, so you can forget about it. Component manufacturers fill market gaps; they tend not to overlap new products unless there is a clear market segment being addressed.

If you don't care about RTX, get a used 1070 Ti or 1080. I got a 1070 Ti for $280, best bang for buck IMO. You can get 1080s for $350 and Tis for $400, if you hunt well.
 
Never in the modern history of 3D gaming has anyone gone back on a DirectX API change, or left it out of high-end hardware.

Since DX5, nothing added to the spec has ever been removed for missing the mark on initial release. That includes programmable shaders in DX8, unified shaders in DX10, and tessellation in DX11.

The GeForce 3 sold like shit on release as well, and Nvidia's solution was to lower prices as increased production lowered costs, rather than release a high-end part without shaders.
 
Never say never, I get a small feeling it will happen.

https://www.tomshardware.com/news/nvidia-geforce-gtx-1180-hp-omen-obelisk,38669.html

Update 2/22/19: HP representatives confirmed that the GTX 1180 branding was used as a placeholder while the company awaited the finalized branding for the RTX 2080 graphics cards. HP updated its specifications in press materials at the launch of the OMEN Obelisk Desktop, but the official datasheet on HP.com still erroneously listed the GTX 1180 at the time of publication.

nVidia will do whatever brings them the most bundles of our cash... if at one point calling a card 1180 brings them the most, it'll happen...
 
Never in the modern history of 3D gaming has anyone gone back on a DirectX API change, or left it out of high-end hardware.

Since DX5, nothing added to the spec has ever been removed for missing the mark on initial release. That includes programmable shaders in DX8, unified shaders in DX10, and tessellation in DX11.

The GeForce 3 sold like shit on release as well, and Nvidia's solution was to lower prices and increase performance with a die shrink (GeForce 4), rather than release a high-end part without shaders.

This makes the most sense. We all know when the 7nm refresh comes out it's going to add a nice bit of performance. And until then there's lots of room for price cuts.
 
So after 10 years of planning and work, they should just give up when RT doesn't set the market on fire after 5 months? This is how losers think.
Throwing good money after bad is how bankrupt companies think.
 
Ray tracing is obviously the future.

However, it will never be mainstream if it is locked to specific high-end cards from one vendor.

I'd expect the features will eventually come to lower-end cards (and hopefully AMD too) but for now it is a luxury.
 
Never in the modern history of 3D gaming has anyone gone back on a DirectX API change, or left it out of high-end hardware.

Since DX5, nothing added to the spec has ever been removed for missing the mark on initial release. That includes programmable shaders in DX8, unified shaders in DX10, and tessellation in DX11.

The GeForce 3 sold like shit on release as well, and Nvidia's solution was to lower prices as increased production lowered costs, rather than release a high-end part without shaders.

True, but plenty of DX features have been pretty much ignored.

How long did we sit on majority DX9 titles, even after DX10 and 11 and 12 were released?
 
I feel like DX9 was such a breakthrough at the time, while the newer APIs have been incremental and haven't offered a substantial difference.

I mean, some games today support DX11 and DX12 but they look exactly the same. There is no visible benefit, and sometimes performance is even slower. Kind of a hard sell.

At least with DOOM and Wolfenstein using Vulkan there are tangible performance gains, especially for older hardware. That's what was promised, but elsewhere it largely hasn't materialized.

Maybe ray-tracing will be the jump we are looking for. At least the graphics actually *look* better, there is no denying that (performance concerns aside).
 
https://www.tomshardware.com/news/nvidia-geforce-gtx-1180-hp-omen-obelisk,38669.html

Update 2/22/19: HP representatives confirmed that the GTX 1180 branding was used as a placeholder while the company awaited the finalized branding for the RTX 2080 graphics cards. HP updated its specifications in press materials at the launch of the OMEN Obelisk Desktop, but the official datasheet on HP.com still erroneously listed the GTX 1180 at the time of publication.

nVidia will do whatever brings them the most bundles of our cash... if at one point calling a card 1180 brings them the most, it'll happen...

You just broke OP's heart with that OEM news.
 
The RTX "premium" is really a matter of optics (public perception) and market segmentation (what Nvidia decides to do).

The RTX 2060 has 1920 shaders for $350 MSRP. The 1660 Ti has 1536 shaders for $280 MSRP. So the RTX 2060 has 25% more shaders for 25% more money, and better yet, they're throwing in a game bundle with the 2060 but not the 1660 Ti. So in effect you're actually paying a premium to *not* have RTX.

See how easily perception affects everything? The reception of selling RTX as a premium feature was poor, but judging by the more positive 1660 Ti response, they instead succeeded in selling non-RTX as the "premium" feature.
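The 25%-for-25% claim above checks out arithmetically; here's a tiny sketch using just the shader counts and MSRPs cited in this post (taken at face value, not independently verified):

```python
# Price-per-shader comparison from the shader counts and MSRPs cited above.
cards = {
    "RTX 2060":    {"shaders": 1920, "msrp": 350},
    "GTX 1660 Ti": {"shaders": 1536, "msrp": 280},
}

for name, c in cards.items():
    cents = 100 * c["msrp"] / c["shaders"]
    print(f"{name}: {cents:.1f} cents per shader")

# Both ratios come out to exactly 1.25: 25% more shaders for 25% more money.
shader_ratio = cards["RTX 2060"]["shaders"] / cards["GTX 1660 Ti"]["shaders"]
price_ratio = cards["RTX 2060"]["msrp"] / cards["GTX 1660 Ti"]["msrp"]
print(shader_ratio, price_ratio)
```

So per shader the two cards cost the same; the game bundle is the only thing tipping the scale.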
 
You pretty much made up the rumor that an 1180 card would launch this summer, so the whole thread is illogical ;)


Shut up, I got this info from a Google search, so this rumor started way before me, and I never said it was for sure going to happen. I don't care if the 1180 comes out or not; I already pulled the trigger on an RTX 2060 and I'm going to wait for the 7nm cards at this point.
 
"There WON'T be any 1100-series anything cards"

That's your opinion; you never know, we will see this summer. The 1600 series is bullshit? How is the 1600 series bullshit? It's the true successor to the 1060; it's the mainstream card 1080p users have been waiting for, basically a 1070 for 1060 money. You do know that 1080p is still by far the most popular resolution, right?
Oh yeah, I posted that before I'd seen and read the myriad of benchmarks, and the 1660 Ti and the 1660 don't look all that bad. Good clock speeds and good performance in general, so yeah, you can disregard that part of my post :D
PS. The Ti version looks pretty beastly, but I wish they'd drop the price a bit, maybe to two-fiddy or lower.
 
No new top GPU models from Nvidia this year. There is no pressure from AMD to make that viable. They will release smaller models in the 20xx/16xx range.
 
Before the RTX 2080 release, rumors were that it would be called an 1180. That's where the web references to the 1180 come from. Now it's clear that there will never be an 1180.
 
Honestly, people who want to upgrade should be looking at a used 1080 Ti over any 11xx cards. If they do release an 1180, it will almost certainly still be slower than a 1080 Ti. A used 1080 Ti is a steal at around $500 atm.
 
I don't see why Nvidia would make a new high-tier card without ray tracing that could potentially ruin their RTX plans for the future... but then again, it truly would not surprise me...
 
I don't see why Nvidia would make a new high-tier card without ray tracing that could potentially ruin their RTX plans for the future... but then again, it truly would not surprise me...

If NVidia were utterly humbled and broken on RTX and did a complete retreat from it, then there would be high-end cards without it. But that is unlikely in the extreme, and it wouldn't happen in this generation or even the next; it would likely take at least two failed RTX generations before they threw in the towel.

So while it is within the realm of possibility that there might eventually be a reversal, it certainly will never happen on generation 1xxx cards, and I would bet on it never happening in any generation. IMO real-time ray tracing in gaming cards is here to stay on the NVidia side, and AMD and Intel will join in soon enough.
 
There will be a 7nm shrink in Nvidia's future. When that is really depends on Samsung. It could be out later this year, depending on how quickly they can ramp up large dies.

Why wouldn't NVidia use TSMC as their major partner like they have done for over a decade?
 
I feel like DX9 was such a breakthrough at the time, while the newer APIs have been incremental and haven't offered a substantial difference.

I mean, some games today support DX11 and DX12 but they look exactly the same. There is no visible benefit, and sometimes performance is even slower. Kind of a hard sell.

At least with DOOM and Wolfenstein using Vulkan there are tangible performance gains, especially for older hardware. That's what was promised, but elsewhere it largely hasn't materialized.

Maybe ray-tracing will be the jump we are looking for. At least the graphics actually *look* better, there is no denying that (performance concerns aside).
If you have a draw-call count of, say, 40K, it will look the same on most APIs; unless you change the game engine to do more, you are not going to find much difference between them. If you own an AMD card, you can download Oxide's Star Swarm demo using Mantle, which keeps pushing more data through, as much as your hardware can handle.

The reason you get a benefit from certain hardware on a given API is when you struggle with one of the usual problems. One of those is weak single-core performance (instructions per cycle). Running Mantle (a proxy for DX12/Vulkan) in Battlefield 4 on a Phenom II CPU would alleviate that problem when you have a good graphics card.

The difference is there, it's just not used.
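The single-core bottleneck described above can be sketched with a toy model of CPU-side frame time, where the driver pays a fixed cost per draw call. The per-call overheads below are made-up illustrative numbers, not measurements:

```python
# Toy model: CPU time to submit a frame = draw calls x per-call driver cost.
# The per-call overheads here are hypothetical, chosen only for illustration.
def cpu_frame_ms(draw_calls: int, us_per_call: float) -> float:
    """CPU-side submission time for one frame, in milliseconds."""
    return draw_calls * us_per_call / 1000.0

DRAW_CALLS = 40_000  # the 40K figure from the post above

for api, us_per_call in [("high-overhead API (DX11-style)", 2.0),
                         ("low-overhead API (Mantle/DX12/Vulkan-style)", 0.5)]:
    ms = cpu_frame_ms(DRAW_CALLS, us_per_call)
    print(f"{api}: {ms:.1f} ms submit -> ~{1000 / ms:.1f} fps cap if CPU-bound")
```

With the same GPU, the slow single core caps the high-overhead path at 80 ms per frame (12.5 fps) in this sketch, while the low-overhead path needs only 20 ms, which is the kind of relief Mantle gave a Phenom II in Battlefield 4.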
 
Why wouldn't NVidia use TSMC as their major partner like they have done for over a decade?

Because they split duties between TSMC (large dies) and Samsung (small dies) for Pascal, and every rumor site on the web is saying they've decided to go with Samsung for the new EUV process tech (or maybe switch the large dies to Samsung and use TSMC for the smaller stuff).

I guess we will see what actually happens.

https://www.extremetech.com/computing/283241-nvidia-may-tap-samsung-for-7nm-ampere-arrives-in-2020

https://www.tomshardware.com/reviews/nvidia-samsung-7nm-euv-2020-graphics-cards,5956.html

https://www.pcgamesn.com/nvidia-7nm-gpu-tsmc
 
I just got a 2070 as I got tired of waiting.

And if Navi comes out with something more interesting in the summer/fall, I should still be able to get a few hundred for it, should I decide to get something else.
 