Is buying an Nvidia GPU today a bit foolish, if Volta is (said to be) coming soon?

oblongpolygon

Weaksauce
Joined
Sep 4, 2017
Messages
119
Sooo... is buying an Nvidia GPU, like right now, a bit foolish, given that rumors suggest Volta is right around the corner?

I understand there's no right or wrong answer, here... I guess I'm just interested in the thoughts of members.

Personally, I'm torn between the blower and open-air style coolers... which has paralyzed my ability to make a choice. And that caused me to think...

Maybe I should just hold off, for a bit, and see what becomes available in the next few months!

After all, in the interim, the integrated graphics of a recent CPU, like the 8700, would surely handle typical desktop tasks, like browsing, without trouble.

And I think integrated graphics could even handle other tasks, like CAD, too!
 
Pascal is at end of life from a marketing/selling perspective. Whatever you buy now is going to be superseded relatively soon by the next gen, unless you buy a 1080 Ti; but that is highly dependent on which cards nVidia decides to launch first.

Since you're not specific about which Intel chip's IGP you're talking about, is it safe to presume you're looking to build a full system?
 
I am hoping Nvidia releases the high end first so supplies have a chance of being available while they ramp up for the rest of the cards. The old-fashioned way. The Titan V seemed like an afterthought by comparison.
 
I'm waiting on something reasonable as far as upgrades go... let the GPU and mining war blow over...
 
If I had to put money on it, I'd say June is when you will be able to buy an AIB 2080. And it will still be extremely hard to get your hands on one.
 
If it's anything like in Norway, the 1080 Ti cost a lot more than at launch even... it wouldn't surprise me if you can get at least the FE for less than a 1080 Ti, especially now that there are so few cards left and no more coming. In Norway I didn't have any issue buying a 1080/1080 Ti from a retailer, Komplett; they tend to get quite a few cards. But let's see, not much is known about performance yet.
 
At current prices? Hell no. But I don't think it will get any better with Volta.
 
If you need/want one now, get one. It's not like a 10-series GPU is going to be obsolete overnight; it will still give you several years of life.
 
1080p, yes -> 1440p, for the most part yes, with a few games that won't. 4K is on the edge already; a lot of games won't run maxed, depending on how much you care about fps xD. The 980 Ti was at least equal to a 1070 and close to a 1080, so close that upgrading wasn't worth it... I guess it depends on your needs? I would wait if I could, or if you get a 1080 Ti for a good price you can always sell it later without too big a loss, I hope? Let's just hope these 2080/Ti cards won't sell for $1,100-1,200 at launch :|
 
I'm super looking forward to actually trying to get my hands on one without selling a kidney /s :p
 
So what do you guys think of pricing once Volta appears: will the old cards drop significantly, or not? I just don't see how Nvidia could price the new cards above current levels; it's just too much. But I don't know, it feels like anything is possible in today's GPU market. I'm only waiting to upgrade my GTX 970, as Kingdom Come: Deliverance is pretty demanding (reviews suggest around 30-60 fps on a GTX 970 at high quality settings). Granted, I'm a 1080p (144 Hz) user, but yeah.

I don't want to pay above 400€ for a graphics card regardless of my finances; it has always felt nuts to me to pay a lot for a high-end GPU relative to what you get for the price with just about any other hardware (high-end phones at $700-1,000 at launch are somewhat comparable, though).
 

I'd expect the 1180/2080 to be 700-800 at launch like the 1080 was.
 

Yea, that sounds about right to me. Then there's a chance I might be able to pick up a GTX 1070 Ti, which is still 2x faster than my GTX 970; that's more of an upgrade than I'm used to making at one time (the prices made me hold on much longer this time around).
 
I'm afraid that even when Volta launches, prices are still going to be what they are now, or possibly worse. Nvidia said in the last few days that GPU prices will remain high, and get even higher, through Q3 2018. Lots of articles on it: https://www.dsogaming.com/news/nvid...ces-will-continue-increasing-through-q3-2018/. So even with new cards launching, they most likely won't be at MSRP, because retailers are loving the jacked-up prices. With Nvidia coming out and saying this, there's no reason for retail channels to lower prices or sell near MSRP; everything's going to keep getting scooped up immediately.
 
Is Volta going to be a HUGE leap forward? I mean huge. Like we haven't seen in at least 20 years?

If not, you have to weigh the economics. If you're happy with what you have, keep it and wait. However, if using it is like having long hot steel spikes slowly penetrating your eyeballs, then, yes, go ahead and buy. Buy instead of cry.
 
I think Maxwell to Pascal was one of the biggest increases I vaguely remember reading about, in recent history or maybe all-time. I don't really remember and am not sure, so I wouldn't take my own word on that, lol. So if it's anything like Maxwell -> Pascal, it will be pretty nice anyway! At least I've got my 1080 Ti to possibly sell off to upgrade to a 2080 Ti.
 
Since you're not specific about which Intel chip's IGP you're talking about, is it safe to presume you're looking to build a full system?

Well, the CPU I've bought, the Intel 8700, has a UHD 630 in it.

So, if I don't buy a GPU, because Volta is around the corner (or for other reasons), then I have the UHD 630 for desktop tasks... which is 95% of my computing. Only 5% of uptime is gaming, for me... sadly?


I'm waiting on something reasonable as far as upgrade....let the gpu and mining war blow over...

I just feel like we could be waiting forever for a mining slowdown...


If I had to put money on it I'd say June is when you will be able to buy an AIB 2080. And it will still be extremely hard to get your hands on one.

Not long to wait for a Volta release... if a June release date is correct. So maybe I should hold off on purchasing a GPU...


If you need/want one now, get one. It's not like a 10-series GPU is going to be obsolete overnight; it will still give you several years of life.

It just seems crazy for me to buy a dedicated GPU for the first time in about 15 years when another huge leap in processing power could be just around the corner.

As said above, if the leap from Pascal to Volta is anything like the leap from Maxwell to Pascal, then there is another big leap coming.

(I'll try and find and post one of those graphs showing exponential improvements in some metric which I don't understand.)
 
If you can stand to wait, I'd wait. The IGP isn't great; I used the IGP in the rig in my sig for about a year before settling on the 1070, and while painful, it was usable.
 
I used the IGP in the rig in my sig for about a year before settling on the 1070, and while painful, it was usable.

When you say it was painful... do you mean during gaming?

Or was it painful during other tasks? Like, were you doing any CAD or modeling or similar?

I guess I don't really mind if I can't play games... but I do mind if I can't manage to do CAD work.
 
I don’t do CAD or 3D modeling, so I wouldn’t know. But gaming was painful.

Since you’ve asked in other threads about modeling: GPUs are very good at shortening modeling tasks (any vector-based math problem is a GPU dream task). Is there any particular reason your GPU has to be nVidia, especially if gaming isn’t your sole consideration?
 
Looking at what's available on Newegg, I can't see anything that'd really be worth spending money on short of US$100, and that's basically a jump in performance tier.

So here's what I'd suggest: try using the Intel iGPU for your work. At worst it might be a little slow; then you can weigh whether it's worth getting something to hold you over.

And I definitely don't recommend buying something fast today with replacements incoming (your original question).
 
When you say it was painful... do you mean during gaming?

Or was it painful during other tasks? Like, were you doing any CAD or modeling or similar?

I guess I don't really mind if I can't play games... but I do mind if I can't manage to do CAD work.


How many polys are ya doing for your CAD models, and how many models per scene? Are your viewports configured to show textures? Are you rendering?

If you're doing just a few models at like 5 million polys each, a low-end card is really enough; a 1050 Ti is plenty. If you have many objects (I mean a lot, like 100 or so) and all of them are geometry-heavy, ya need a beefy system: CPU, RAM, graphics card. Textures and geometry both eat up VRAM and system RAM; each vertex eats up 8 bytes of memory, so geometry can eat up RAM quickly. For rendering on a GPU, ya need a decent GPU regardless, if you're using iray or one of the GPU raytracing plugins.
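To make that memory math concrete, here's a back-of-envelope sketch. It assumes the rough 8-bytes-per-vertex figure quoted above, which is really a lower bound; a typical vertex format with position, normal, and UVs is 32 bytes or more.

```python
# Back-of-envelope estimate of geometry memory for a scene.
# bytes_per_vertex=8 is the rough figure quoted in this thread;
# real vertex formats (position + normal + UVs) are often 32+ bytes.
def geometry_bytes(num_models, polys_per_model, bytes_per_vertex=8):
    # Assume roughly one vertex per polygon on a typical closed mesh.
    return num_models * polys_per_model * bytes_per_vertex

# A few models at 5 million polys each: trivial for a 4 GB 1050 Ti.
small_scene = geometry_bytes(3, 5_000_000)
print(small_scene / 1024**2, "MiB")  # ~114 MiB

# 100 geometry-heavy models with a fat vertex format: now the
# RAM/VRAM pressure described above is very real.
big_scene = geometry_bytes(100, 5_000_000, bytes_per_vertex=32)
print(big_scene / 1024**3, "GiB")  # ~14.9 GiB
```

The point of the sketch is just the scaling: a handful of models stays in the megabytes, but a hundred heavy objects pushes past what a mid-range card holds.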

If gaming is not something you're really interested in, I would get a gen-old Quadro; those come cheap (compared to current Quadros) and give all ya need and more for modeling. It will game decently too if need be.
 
Um... maybe 1 or 2 million, if I'm subdividing for detail.

But you're right, a 1050 Ti should be plenty for most CAD/modeling, certainly for what I'm doing.

The issue, then, is that I'd be buying a low-to-mid-range card which will easily be superseded in months, by the sound of things: https://www.tweaktown.com/news/60839/nvidia-unveil-mining-specific-geforce-gtx-2080-cards/index.html

And so I feel like, if we hold out for just a bit longer, we can buy once and get a better deal, either:
  • by buying a 1050 Ti in a few months and saving money, because a newly released range lowers existing 1050 prices, or
  • by buying something from the new/improved range in a few months.
 
Nothing official has been announced yet, so the real availability date is unknown. Recent history is Vega: launch late 2016... no, Jan 2017... no, March... it ended up being July 2017. If your work is suffering??? Come on, buy the 1050 Ti. Plus, the pro cards do CAD so much better than gaming cards that Razor's advice on a used Quadro is rather wise.
 
Um... maybe 1 or 2 million, if I'm subdividing for detail.

But you're right, a 1050 Ti should be plenty for most CAD/modeling, certainly for what I'm doing.

The issue, then, is that I'd be buying a low-to-mid-range card which will easily be superseded in months, by the sound of things: https://www.tweaktown.com/news/60839/nvidia-unveil-mining-specific-geforce-gtx-2080-cards/index.html

And so I feel like, if we hold out for just a bit longer, we can buy once and get a better deal, either:
  • by buying a 1050 Ti in a few months and saving money, because a newly released range lowers existing 1050 prices, or
  • by buying something from the new/improved range in a few months.


I would wait for the next gen to come out and pre-order when it does; it's not that far away.
 

Yeah, just wait... with that card you'll be even more stuck at 1080p... realistically you need a little more for 2K, and likewise more still for 4K.
 
At current prices? hell no. But I don't think it will get any better with volta.
This,

These new cards will get gobbled up just as fast as the Pascal cards. It's always been hard enough to get cards on release, and prices skyrocket as a result. Now with miners in the game, this will only get worse. I don't see the prices of Pascal cards dropping much either. Sure, you'll have miners trying to offload them, creating an influx of supply, but then you'll have everyone looking to buy them because they couldn't before. I see USED Pascal cards going for MSRP once people start clawing for them.
 

I've seen used Pascal cards going for a lot more than MSRP. Just take a look at B-Stock on EVGA.
 
I can't wait to see the shitshow that launch availability and pricing will be. Will be like trying to find a pretty whore when 3 carrier groups are parked at port.
 
I think Maxwell to Pascal was one of the biggest increases I vaguely remember reading about, in recent history or maybe all-time. I don't really remember and am not sure, so I wouldn't take my own word on that, lol. So if it's anything like Maxwell -> Pascal, it will be pretty nice anyway! At least I've got my 1080 Ti to possibly sell off to upgrade to a 2080 Ti.

It won't be. Pascal had the advantage of a die shrink and a second-gen efficiency bump (after Maxwell). The efficiency improvements for Volta/Ampere/whatever it's called will be even smaller than Maxwell's.

Maxwell was the single biggest efficiency bump we've ever seen. Power reduced by 70% on the same process node was no joke.

THE BIGGEST performance increase EVER was the 7900 GTX to 8800 GTX. That was DOUBLE the performance on the same process node. But it came at double the power consumption and a massive die size.
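A quick sketch makes the perf/watt arithmetic in these comparisons concrete. The 70% and 2x figures are the rough numbers quoted in this thread, not measured benchmarks:

```python
# Perf-per-watt arithmetic for the generational claims above.
# Input figures are the rough ones quoted in this thread.

def perf_per_watt(perf, watts):
    return perf / watts

# Maxwell claim: same performance at ~70% less power, same node.
kepler = perf_per_watt(1.0, 1.0)
maxwell = perf_per_watt(1.0, 0.3)
print(round(maxwell / kepler, 2))  # 3.33 -> a ~3.3x efficiency jump

# 7900 GTX -> 8800 GTX: double the performance at double the power.
# Raw performance doubled, but efficiency stayed flat.
g71 = perf_per_watt(1.0, 1.0)
g80 = perf_per_watt(2.0, 2.0)
print(g80 / g71)  # 1.0
```

Which is the distinction the post is drawing: the 8800 GTX was the biggest raw performance jump, while Maxwell was the biggest efficiency jump.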
 
I can't wait to see the shitshow that launch availability and pricing will be. Will be like trying to find a pretty whore when 3 carrier groups are parked at port.

I imagine Nvidia will have a decent amount of availability on their website, so it won't sell out within the first 5 minutes. Pricing will be a different story; I expect it to be 20-30 more than Pascal pricing.
 
I imagine Nvidia will have decent amount of availability on their website that it wont sale out within the first 5 minutes. Pricing will be a different, I expect it to be 20-30 more than Pascal pricing.
Yeah, I would like to think so. Since the 10xx cards from nV and the RX 4xx/5xx and RX Vega have been discontinued since last November, I would like to think both companies have been SUPER busy minting GPUs and filling up their warehouses with them, or sending them to the graphics card manufacturers to stock the hell up, so it wouldn't be the shitty disaster that last year's launches were (mostly pointing at AMD).
 
It won't be. Pascal had the advantage of a die shrink and a second-gen efficiency bump (after Maxwell). The efficiency improvements for Volta/Ampere/whatever it's called will be even smaller than Maxwell's.

Maxwell was the single biggest efficiency bump we've ever seen. Power reduced by 70% on the same process node was no joke.

THE BIGGEST performance increase EVER was the 7900 GTX to 8800 GTX. That was DOUBLE the performance on the same process node. But it came at double the power consumption and a massive die size.

The 8800 GTX was a viable gaming card for 3-4 years after release. Probably the best money I ever spent on a video card.
 
Yeah, I would like to think so. Since the 10xx cards from nV and the RX 4xx/5xx and RX Vega have been discontinued since last November, I would like to think both companies have been SUPER busy minting GPUs and filling up their warehouses with them, or sending them to the graphics card manufacturers to stock the hell up, so it wouldn't be the shitty disaster that last year's launches were (mostly pointing at AMD).

I imagine both Nvidia and AMD will be filling their warehouses with their next-gen GPUs months ahead of release, but I don't recall RX Vega being discontinued; I do recall AMD having production issues due to HBM availability.
 
It won't be. Pascal had the advantage of a die shrink and a second-gen efficiency bump (after Maxwell). The efficiency improvements for Volta/Ampere/whatever it's called will be even smaller than Maxwell.

Maxwell was the single biggest efficiency bump we're ever seen. Power reduced by 70% on the same process node was no joke.

THE BIGGEST performance increase EVER was the 7900 GTX to 8800 GTX. That was DOUBLE performance, on the same process node. But it came at double the power consumption, and massive die size.


We already saw this with compute: perf/watt on Volta can actually be greater, averaging around 70%, with an 800mm² die ;) I expect at least a 70% perf/watt improvement in gaming from Volta/Turing/Ampere, whatever it's called.
 
I'm super looking forward to actually trying to get my hands on one without selling a kidney /s :p

Wait, there's a market for kidneys? You only need one, right?

I just feel like we could be waiting forever for a mining slowdown...

Don't forget the other end. Nvidia's probably got a much better idea than we do of what demand is going to look like in the near term. Increasing capacity is no small risk, but it has to be balanced against the risk of being too cautious. At some point they will increase capacity, if they think current demand is the new normal.
 