Volta GDDR6 release in May?

My 980 Ti just needs to keep truckin' along for a little while longer... Volta or Vega will be my next GPU: whichever provides at least double the performance, lower power draw, and less heat output in the <$600 range I'm sticking to.
 
Ok, just to be clear, the 1080 is considered "midrange" now? Just trying to get things aligned; AMD is so far behind now that it's difficult to map what counts as midrange across the two product stacks.

Nah. Would say product stack looks like this:

Low-end enthusiast: 1050 Ti
Mid-low: 1060
Mid-high: 1070
High: 1080
Halo: 1080 Ti

With Titan Xp being the icing on that halo.
 
1050/1060 is mainstream
1070/1080 is performance
1080ti is enthusiast
Titan Xp is halo

That's how I would rate it.
 
It's high mid-range in the actual product stack:

1080 -> Titan X -> 1080 Ti -> Titan Xp

The 1080 has three other cards above it in the product stack, all of which beat the 1080 like a red-headed stepchild.
 
Unlike Intel, nV can't do that; if they do, they lose sales. There is no revolving door for GPUs. IT departments usually have protocols for replacing PCs after a certain period of time, so Intel gets a revolving door from businesses. GPUs, on the other hand, are not a distributed product like complete systems; they are add-in products, so a person will not upgrade a GPU the way they cycle their systems unless there is a need to.

Unless nV innovates and makes a product that gives people a reason to upgrade (in a saturated market, the product has to create its own demand), they won't be able to keep sales going.

I don't totally disagree with you; the problem I see for Nvidia is giving a guy a reason to even want to upgrade at this point. Unless you're doing VR or 4K, a current 1080-or-above owner is going to have little reason to care about Volta or Vega. Most people are at 1440p or less, so I see a massive drop in people wanting or needing to upgrade. I mean, if you're hitting above 60 fps and closer to a constant 100 fps, you're not upgrading no matter how much bling either company slaps on it. As we can see from Nvidia's financials, they are heavily dependent on video card sales, and that is not good for them; even AMD will face this issue, they just have the CPU business to help them, though. Mobile is about the last base left that needs better cards, but I am not sure how big of a market that is. Games just are not driving cards into sub-par performance as easily anymore, so I see an issue coming for both companies.
 
And why do you think the performance tier is now bigger than the mid-range tier? The lion's share of monitors are 1080p, but performance cards are overkill for 1080p? This is the mindset of AIB buyers and OEMs; they know where the money is for them, and they push those products because it's better for their bottom line.

Going by your logic, anyone with a 970 or 980 would not have upgraded to Pascal. Yet they did, lol.

Although I agree with your assessment that games are not driving cards into sub-par performance at default settings, the drive to get better hardware is still there, because some features in those games can push those cards too. And with new monitor technologies and higher resolutions, those performance cards are being pushed.

This is not 10 years ago, when the mid-range was the bread and butter of the industry; chip and node complexity have increased costs, and this in turn pushed nV, OEMs, and AIB partners to restructure pricing and marketing tactics to better suit the goal of staying healthy. AMD still hasn't gotten around to this, for whatever reason.
 
People bought 1080s because they performed great, no doubt. But people don't buy monitors very often, and that's one of the reasons 1080p is the popular resolution. Why wouldn't I get a 1080 and have all the performance I need now, and likely for years to come? A few people did upgrade from a 970 or 980, but quite a few still have them as well. There will always be a couple of people who have to have the best because benchmarks. There is always some stupid setting that kills performance for no reason and adds nothing to the experience, but no one really upgrades because they need to see hair at max tessellation. Nothing is pushing a 1080 card at 1080p or 1440p that I have seen. We'll see how Volta sells, but I have a feeling it will be the beginning of a continuing slide in sales unless something shakes the market up and creates a massive need again. We'll see what happens, though.
 
Games aren't evolving? Games are pushing these cards when using higher-than-default settings. Look at Gears of War 4; that game came out what, six months ago?

Ultra settings, and you can still push it further if you like; Ultra isn't the highest.

http://www.guru3d.com/articles_pages/msi_gtx_1080_ti_gaming_x_review,15.html

Deus Ex?

http://www.guru3d.com/articles-pages/msi-gtx-1080-ti-gaming-x-review,17.html

Sniper Elite 4

http://www.guru3d.com/articles_pages/msi_gtx_1080_ti_gaming_x_review,12.html

None of these games are using the highest possible settings (Ultra is not the highest they can go), yet they are pushing these cards down to around 60 FPS at 1440p, and we know most people are only at 1080p.

This is the same logic as saying Ryzen's gaming performance is enough. No, it's not; Intel's IPC in current games is simply higher, and there is nothing AMD can do about it. It might be "enough" perceived performance for you, but that is not how the market works, nor how most consumers evaluate products when they buy; they buy based on the best performance and features possible at a price they are comfortable with. So if nV doesn't have new products on its generational cycle times, it won't just miss out on extra sales, it will lose sales, because outside of OEMs, DIY buyers are not going to upgrade, and that will hurt total sales volume and play into AMD's hands when market-share numbers come out. nV isn't giving Intel or Google, companies with far more resources than it has, any room to maneuver; you think it will give AMD any chance?

nV is selling to its own customers and making AMD loyalists switch over. It has nothing to do with people having enough graphics card performance; people want these cards and are purchasing them because they either want the extra performance, need the extra performance, or perceive they will need it at a later date.

And then you start looking at professionals who need these cards for 2D/3D work; you think they won't upgrade either? Just to give you an example, baking an 8K texture set with 8x AA takes 3 to 4 hours on a Titan Xp, and while it's doing that I can't really do anything else with my system, as my CPU and RAM are all used up. Shit, would I love to cut that time down to something like 2 hours!
 
Did you not read what I wrote, man? I said only 4K and VR have a need, and then you show me graphs where games choke at 4K... no shit, I just said that. 1440p and down is where almost everyone is, and it's around 100 fps or more there. Now you want to go on a rant about Ryzen, which was never mentioned; try to stay focused on a subject. The well is running dry, and if you don't agree that is fine, but try to stay on subject for a change.
 
Games these days tend to incorporate the expensive shaders and effects from the Ultra settings of games from a few years ago into their baseline configuration. I am noticing more and more often that the differences between minimum and higher settings are becoming less pronounced, usually relegated to more expensive forms of AO, dynamic shadows for more objects in the environment, etc. Whereas a few years ago the difference between low and high was night and day. Possibly console influence?

I keep hearing this argument of "nothing pushes card X at resolution Y; the only settings that really kill performance are stupid things nobody notices anyway," etc., but that's just because the closer you get to photorealistic real-time graphics, the more subtle the improvements will be; you may perceive that the IQ is good but not quite be able to put your finger on why. The Division included an option for HFTS, which probably falls into the category of expensive shaders he considers useless. The improvements are subtle, but the computation required to perform those lighting calculations in real time is massive. That's just how it is.
 
The games I just listed, at 1440p and NOT at their highest possible settings, are running around 60 FPS, lol. I did read what you wrote, and I just showed you they are being pushed by newer games.

I mentioned Ryzen because you are using the same argument you and others did about Ryzen's gaming performance and how it's "enough." No, there is no such thing as enough in the eyes of a consumer, because it's an ever-evolving, dynamic landscape as long as the price brackets stay the same and tech improves.

What kind of graphics cards have you purchased in the past 5 years? List them, and I will compare them with the games around the time those cards came out and tell you whether your purchases were worth it based on the logic you just spouted here; put game performance charts up too, and you know what we will see. Your logic will not hold up if you upgraded graphics cards every 2.5 years or so.
 
Very true; our eyes' and perception's ability to judge something as real is based on years of seeing the real world, and duplicating those nuances in a game is extremely expensive. I think people underestimate the amount of data our eyes and brains can perceive when making their arguments, because they don't think about those things.
 
Fancy new 1080 with 11 Gbps memory as well. Shit is demanding.

Yeah, I agree completely, and that's one of the reasons I was really blown away by Obduction running on UE4, because man, I have never seen more realistic surfaces before, and it really just hits you how unrealistic they are in other games, lol.

 
There's truth to that. Nothing is ever enough. But, that being said, an "enough" argument can be made if there are tangible benefits to offset it. The usual argument for Ryzen is that gaming performance is "enough" if you see a benefit from the extra cores somewhere else. In other words, it's just prioritizing your needs against your budget.

That kind of thing is somewhat less relevant in the GPU space, because it's more specialized. Although, not completely irrelevant either. Back in the day, I bought a stable of Radeons because they were far better at crypto mining at the time, even if slightly slower than equivalent Nvidia cards in gaming. But, a niche case.

Point being, the argument works if there's an offset/tradeoff somewhere. It doesn't work if there isn't.
 
Yeah, I agree with that, and that is what Gideon is not mentioning, or is forgetting: the consumer's needs and wants are part of the equation too.
 
I don't think a consumer-level Volta will be out before the end of 2017 unless AMD makes a significant push with their hardware. As it stands, they are looking to push cores to the professional/server end of things first by Q3 2017, so you're looking at nV's February launch window for consumer-level parts. I'm probably sticking with my 1080 until I can afford a good 4K upgrade plus some 1180 part...
 
There is a single reason Volta will be out this year for consumers, no matter what people may think of this and that: keeping up the revenue momentum from upgrades.

Remember, the GTX 1080 being announced a month after GP100 and shipping three weeks later wasn't considered possible either, according to a lot of people.
 
I'm going by what my ex-coworkers tell me, but who knows at this point what may change later. My main focus these days is the prosumer/workstation parts; the consumer side is secondary. From the talks here at GTC and the info I am getting, they are really working to push the non-consumer parts out, despite the consumer level providing the revenue stream.
 
I got a 1080 Ti here in the forums for like $575 and then threw the hybrid kit on it. Now I am rocking 2 GHz+ all day long; it actually runs between 2038 and 2059, mostly 2050, with memory at 6000 MHz for 528 GB/s. I really don't know if I need anything faster at this point, even for a few years. Lol
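That 528 GB/s figure is consistent with the 1080 Ti's 352-bit memory bus: GDDR5X moves data at twice the clock most monitoring tools report, so a 6000 MHz reading works out to 12 Gbps effective. A quick back-of-envelope check (the bus width and the clock-to-rate doubling are standard 1080 Ti/GDDR5X numbers, not from the post):

```python
# Back-of-envelope memory bandwidth for an overclocked GTX 1080 Ti
bus_width_bits = 352                                 # 1080 Ti memory bus width
reported_clock_mhz = 6000                            # as shown by typical monitoring tools
effective_rate_gbps = reported_clock_mhz * 2 / 1000  # GDDR5X: 2x the reported clock
bandwidth_gb_per_s = bus_width_bits / 8 * effective_rate_gbps
print(bandwidth_gb_per_s)  # 528.0
```

Stock 11 Gbps memory on the same bus gives 484 GB/s, so the overclock buys roughly 9% more bandwidth.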
 
"Need" is a very strong word. Technically I don't "need" anything more than the old Q6600 box that's sitting in my MAME cabinet with a hobbled 7950 with a bad fan bearing in it. But we all know how that kind of thing ends up...
 
"Need" is a very strong word. Technically I don't "need" anything more than the old Q6600 box that's sitting in my MAME cabinet with a hobbled 7950 with a bad fan bearing in it. But we all know how that kind of thing ends up...
I don't need a 1080 Ti, but I got it, so you are right. But this is the fastest card I have ever bought. I won't waste any money on Volta unless games get super demanding at 1440p and Volta blows the 1080 Ti out of the water. It would have to be 70-80% faster than my OC 1080 Ti for me to throw my "needs" argument away and open my wallet just for the hell of it.
 
After all, the professional side is where the money is, while the consumer side is where the volume is. My feeling, though, is that Nvidia didn't see much professional-market adoption of the Pascal-based cards and is hoping Volta changes that. Either way, it's nothing new; both sides have been doing releases this way for a long time. Usually it's no more than a few months between when the professional parts are released and when the consumer parts are released, so Q4 is still an option, most likely around October or November.
 
The money is in gaming. After that the money is in AI/HPC and lastly comes the professional segment.

Q1 result:
Gaming 1027M$ up 49% YoY
Professional Visualization 205M$ up 8% YoY
Datacenter 409M$ up 186% YoY
Automotive 140M$ up 24% YoY
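Tallying those four quoted segments gives a sense of proportion: gaming is well over half of the listed revenue. A small sketch (shares are of the four quoted segments only; NVIDIA also reports an OEM & IP line not listed in the post):

```python
# Share of each quoted segment in NVIDIA's Q1 figures from the post ($M)
segments = {
    "Gaming": 1027,
    "Professional Visualization": 205,
    "Datacenter": 409,
    "Automotive": 140,
}
total = sum(segments.values())  # 1781
for name, revenue in segments.items():
    print(f"{name}: {revenue / total:.1%}")
```

Gaming comes out to roughly 58% of the listed segments, with Datacenter a distant second at about 23%.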
 
Data center is the future, possibly automotive too; professional will stay pretty flat.
 
Here's a quick chart over the years:

[Chart: nvidia-q3-2016-groups.jpg, NVIDIA revenue by segment over the years]
 
Automotive is certainly also the future.

Cloud, automation/automotive, and IoT are the growth areas for the long term.
 
GTC seems to indicate a big embrace of the professional side for both volume and money. Every company here has some kind of large server-rack or workstation setup ready for market. Even nvidia is getting in on the game with its DGX and VGX server setups. There is enough talk around here that consumer may be taking a back seat to professional.

We have medical, automotive, and cloud services pushing forward here. This is far larger in scale than the consumer-level domain.
 
That's the "Datacenter" segment. There is no growth and no money in Professional Visualization, to put it bluntly.
 
DGX has a workstation, under-the-desk model. I have pictures. This is not targeting datacenters.
 
Agree on the data center, though I feel automotive will probably be the next growth area for Nvidia as autonomous vehicle releases increase in the future. I do agree that visualization will remain flat.
 
That's just the work of putting it into the cloud. But let's be honest here: there is no growth in it because there isn't a growing target segment, and the financials show it.
 
Nvidia and Intel pretty much sit on everything automotive. And it's a place with huge, steady revenue in the very near future.

Intel got BMW.
Nvidia got Toyota, Audi, Volvo, and Tesla, it seems.

Then there is a larger undecided bunch.
 
While I believe consumer will play a good role for them in the future, the idea going around here is a push into all the other domains, i.e., machine learning. Tensor cores are a reality of that drive right now. Open-sourcing the DLA also points to that. Creating the cloud-propagation stack, where you develop on a prosumer workstation or server and easily migrate into the cloud, shows it too. Look around at all the cloud services right now; all of them are incorporating GPU technology, at costs that make the consumer-level cards look paltry: MS Azure, Amazon's cloud, and even nvidia's own push. They are committing to developing, servicing, and maintaining it going forward.
 
Data centers are a big thing now. Trust me, this will be huge by next decade, and it's the reason you are seeing both companies show products for those markets first. We have analytics, software-defined networking... I mean, you name it. Everything is turning into data crunching.
 
Toyota selected nvidia's Drive PX for its autonomous car development. Toyota is a top-two carmaker right now. nvidia is proliferating into multiple domains.

I wonder how much that will impact their bottom line... Last year (by my quick Googling), Toyota sold 2.5 million cars. If EVERY one of those were autonomous, and every one had a $1,000 Nvidia GV100, that's $2.5 billion, or about 50% of Nvidia's revenue. But that assumes every car is autonomous, which is a big stretch right now.

I think automotive will be a good chunk of change for them, but I think it'll be a while before it tops gaming, where they made $2.8 billion.
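That back-of-envelope math checks out under its own assumptions (the 2.5M unit count and the $1,000-per-car module price are the post's guesses, not confirmed figures):

```python
# Hypothetical automotive revenue vs. quoted gaming revenue, in $B
cars_sold = 2.5e6        # assumed annual Toyota unit sales (post's figure)
price_per_car = 1000     # assumed $ for an autonomous-driving module per car
auto_revenue_b = cars_sold * price_per_car / 1e9
gaming_revenue_b = 2.8   # NVIDIA gaming revenue quoted in the post
print(auto_revenue_b)                     # 2.5
print(auto_revenue_b / gaming_revenue_b)  # ~0.89, close to but below gaming
```

Even under the generous assumption that every car ships with a module, the hypothetical automotive take stays just under the quoted gaming number.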
 
It'll take a little while yet; the cars start rolling out in 2020. But I would guess Nvidia's automotive revenue will have passed $500M a quarter by 2020.

Toyota makes 7.2M cars a year.
VW 6.1M cars a year.
Ford 5.8M
Honda 4.6M
Nissan 4.5M

And so on, out of roughly 84M cars a year worldwide.
 
I think a LOT depends on government regulations. The tech is probably ready, but you need to make the people and the government ready. There are a whole lot of unanswered questions about autonomous cars that need answers before they start rolling off the line.
 