Vega Rumors

So a 1080 in the same comparison uses 43 cents in 24 hours at full tilt (or 1.8 cents per hour);
as mentioned earlier, a 300 watt Vega will use 72 cents of electricity in 24 hours at full tilt.

If you game for two hours on each, that's 6 cents for Vega, or 4 cents for the 1080.

And your point about not using full TDP for gaming is fair -- which makes it even less for both cards -- if we use 66% of TDP as an estimate, then it's 4 cents for Vega for two hours of gaming and 2.5 cents for the 1080 for two hours of gaming.

Still wholly immaterial -- and all the concern otherwise is folly.

It's not immaterial if the heat is forcing other component fans to run harder, the card is in an SFF build, the end user wants to OC, or the turd is heating up a room. And where do you draw the line with efficient products? Or do you not -- because by your logic crap engineering only costs $.05 a day, so we should never really consider or reward efficiency unless it's the clearly better deal?
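For reference, here is a minimal sketch of the arithmetic behind the cost figures quoted above, assuming the roughly $0.10/kWh electricity rate those numbers imply and the 180 W / 300 W TDPs being discussed (both assumptions, not measured values):

```python
# Sketch of the quoted electricity-cost figures.
# Assumes ~$0.10/kWh and 180 W (1080) / 300 W (Vega) TDPs, as discussed in the thread.
RATE_DOLLARS_PER_KWH = 0.10  # assumption; adjust for your local rate

def cost_cents(watts, hours, rate=RATE_DOLLARS_PER_KWH):
    """Electricity cost in cents for a given average draw and duration."""
    return watts / 1000 * hours * rate * 100

print(round(cost_cents(300, 24)))           # Vega at full tilt, 24 h: 72 cents
print(round(cost_cents(180, 24)))           # GTX 1080 at full tilt, 24 h: 43 cents
print(round(cost_cents(300 * 0.66, 2), 1))  # Vega, ~66% TDP, 2 h of gaming: ~4.0 cents
print(round(cost_cents(180 * 0.66, 2), 1))  # 1080, ~66% TDP, 2 h of gaming: ~2.4 cents
```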
 
So a 1080 in the same comparison uses 43 cents in 24 hours at full tilt (or 1.8 cents per hour);
as mentioned earlier, a 300 watt Vega will use 72 cents of electricity in 24 hours at full tilt.

If you game for two hours on each, that's 6 cents for Vega, or 4 cents for the 1080.

And your point about not using full TDP for gaming is fair -- which makes it even less for both cards -- if we use 66% of TDP as an estimate, then it's 4 cents for Vega for two hours of gaming and 2.5 cents for the 1080 for two hours of gaming.

Still wholly immaterial -- and all the concern otherwise is folly.

Inauthentically pretending that power efficiency is "wholly immaterial" is, as the internet likes to say, willfully ignorant. Cooling architecture, laptop-ability, SoC applications, etc. Not to mention the fact that some people don't want a mega space heater in their office fighting their AC unit.

Other applications are important for revenue. With these specs, there would be zero reason for a manufacturer or developer to choose Vega over a 1080.

But if you want to completely negate power efficiency as an issue; what, then, are you claiming the benefit of Vega is? That it's almost as fast as a product that has been available for many months? That it's an exact repeat of the previous generation of products?

I feel like I'm in a time warp. The same screaming about how great the last round of AMD cards were going to be before release, followed by crazy mind-twisting to try and justify faith once reality comes out and proves otherwise.
 
So Ryan talked to AMD, and they stated the gaming drivers already have all the optimizations that RX Vega will have; they just won't have the optimizations added between this month and the launch of RX Vega.

So yeah, these drivers are it. He is also expecting at most a ~10% increase depending on the game, not across the board for RX Vega.
 
Inauthentically pretending that power efficiency is "wholly immaterial" is, as the internet likes to say, willfully ignorant. Cooling architecture, laptop-ability, SoC applications, etc. Not to mention the fact that some people don't want a mega space heater in their office fighting their AC unit.

Other applications are important for revenue. With these specs, there would be zero reason for a manufacturer or developer to choose Vega over a 1080.

These graphics cards we are actively talking about (Vega and the 1080 Ti) in desktop form and at desktop power draw (250 W for the 1080 Ti and 180 W for the 1080) are not making their way into laptops.

Ryzen + Vega APU for laptops is potentially on the horizon - and a potential force to reckon with.
 
These graphics cards we are talking about (Vega and the 1080 Ti) in desktop form and at desktop power draw are not making their way into laptops.

Ryzen + Vega APU for laptops is potentially on the horizon - and a potential force to reckon with.


What are they going to do with the bandwidth bottleneck on the APUs they have right now?
 
These graphics cards we are talking about (Vega and the 1080 Ti) in desktop form and at desktop power draw are not making their way into laptops.

Ryzen + Vega APU for laptops is potentially on the horizon - and a potential force to reckon with.

So you're under the (mis) impression that mobile versions of Nvidia/AMD are created de novo and have nothing to do with the underlying architecture? That's not true.

It's wholly material that their top-of-the-line product is hotter and slower than the competitor. They most certainly do find their way into laptop form factors.

You don't see any gaming laptops with AMD GPUs. For the same reason you won't see any this round. Again, the comparison would be a 1080 non-Ti (which does exist in laptop form factor already), vs. Vega. An extra 120 watts of TDP would massacre the entire setup. It's almost double the TDP of the original GPU in a sub-1" thick chassis.
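As a rough sanity check on the "almost double" point, here is a quick sketch using the ~165 W mobile 1080 and ~300 W Vega figures that come up in this thread (rumor/thread numbers, not spec-sheet values):

```python
# Rough check of the "almost double the TDP" claim.
# Figures are the ones discussed in this thread, not official spec-sheet values.
mobile_1080_tdp = 165  # watts, mobile GTX 1080 figure quoted later in the thread
vega_tdp = 300         # watts, rumored Vega board power

print(vega_tdp - mobile_1080_tdp)            # 135 extra watts to dissipate
print(round(vega_tdp / mobile_1080_tdp, 2))  # 1.82x, i.e. close to double
```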
 
Inauthentically pretending that power efficiency is "wholly immaterial" is, as the internet likes to say, willfully ignorant. Cooling architecture, laptop-ability, SoC applications, etc. Not to mention the fact that some people don't want a mega space heater in their office fighting their AC unit.

Other applications are important for revenue. With these specs, there would be zero reason for a manufacturer or developer to choose Vega over a 1080.

But if you want to completely negate power efficiency as an issue; what, then, are you claiming the benefit of Vega is? That it's almost as fast as a product that has been available for many months? That it's an exact repeat of the previous generation of products?

I feel like I'm in a time warp. The same screaming about how great the last round of AMD cards were going to be before release, followed by crazy mind-twisting to try and justify faith once reality comes out and proves otherwise.

I already mentioned a solid reason why someone might want an AMD card.

Freesync

Here's another one: Eyefinity multi-display setups with mixed PLP (portrait-landscape-portrait) functionality for DX10, 11, and 12. Something Nvidia Surround can't do. Nvidia users' only option is SoftTH, which currently tops out at DX9. But yes -- Freesync is the big one. I'd surely wager my bottom dollar that more Freesync monitors are out there in the wild than G-Sync, since Newegg has five times as many Freesync models for sale as G-Sync monitors, and G-Sync monitors on average cost $200-300 more for the same panel otherwise. So anyone interested in Freesync is eyeing an AMD card -- the MS console XBOX NEXT Scorpio introduces Freesync too, which means Freesync might start making its way to TVs if it hasn't already. Nvidia can't serve Freesync. I've said for six or nine months that Freesync is AMD's ace in the hole.

As to these things being a "mega space heater" in the office or at home: what is the power draw at idle? Because that's closer to what your concern really is. And to Magic Hate Ball's point -- the cards don't have to be stripped down hard on performance to be stripped down hard on power use when considering stuffing them into a mobile application, if that's the desired conversation. Wringing the performance out to the last rung is where the big jump in power draw happens.
 
I already mentioned a solid reason why someone might want an AMD card.

Freesync

Here's another one: Eyefinity multi-display setups with mixed PLP (portrait-landscape-portrait) functionality for DX10, 11, and 12. Something Nvidia Surround can't do. Nvidia users' only option is SoftTH, which currently tops out at DX9. But yes -- Freesync is the big one. I'd surely wager my bottom dollar that more Freesync monitors are out there in the wild than G-Sync, since Newegg has five times as many Freesync models for sale as G-Sync monitors, and G-Sync monitors on average cost $200-300 more for the same panel otherwise. So anyone interested in Freesync is eyeing an AMD card -- the MS console XBOX NEXT Scorpio introduces Freesync too, which means Freesync might start making its way to TVs if it hasn't already. Nvidia can't serve Freesync. I've said for six or nine months that Freesync is AMD's ace in the hole.

As to these things being a "mega space heater": what is the power draw at idle? Because that's closer to what your concern really is. And to Magic Hate Ball's point -- the cards don't have to be stripped down hard on performance to be stripped down hard on power use. Wringing the performance out to the max is where the big jump in power draw happens.

Freesync is an open architecture and can be "served" by Nvidia. So far they haven't had a reason to since they've completely dominated AMD. Again, even laptops use G-Sync screens and Nvidia GPUs. There simply isn't a market share advantage to supporting Freesync.
 
AMD had better price this competitively as hell. Unfortunately, HBM2 does not seem to lend itself to cost cutting the way Ryzen does.
 
These graphics cards we are actively talking about (Vega and the 1080 Ti) in desktop form and at desktop power draw (250 W for the 1080 Ti and 180 W for the 1080) are not making their way into laptops.

Ryzen + Vega APU for laptops is potentially on the horizon - and a potential force to reckon with.

Say what? Laptops have full-die 1070s and 1080s right now, and the only difference is slightly reduced clocks. In fact, my Alienware 17 has a 1070 that often holds 2 GHz steady in games and sometimes drops down to 1900 MHz. That is right in the ballpark of desktop performance in a notebook, and now NVIDIA has taken that a step further with Max-Q, working with manufacturers like Asus to create notebooks that take power efficiency into account to optimize weight and dimensions, with the video card tailored for it. So you are completely clueless or willfully playing ignorant about this very important fact.
 
Freesync is an open architecture and can be "served" by Nvidia. So far they haven't had a reason to since they've completely dominated AMD. Again, even laptops use G-Sync screens and Nvidia GPUs. There simply isn't a market share advantage to supporting Freesync.
Yes -- but if they don't start supporting Freesync, it will bite them, IMO.

Anecdotal, but I read on forums pretty regularly that Nvidia users are eyeing Vega because they have a Freesync monitor and don't plan to upgrade their monitor soon. Some of my LAN party friends have admitted the same. One of my friends went from a 1070 to a Fury X just in the last month or so because he saw the value of Freesync on my machine and wanted to use the Freesync on his display as well. G-Sync is a crumbling empire.
 
Say what? Laptops have full-die 1070s and 1080s right now, and the only difference is slightly reduced clocks. In fact, my Alienware 17 has a 1070 that often holds 2 GHz steady in games and sometimes drops down to 1900 MHz. That is right in the ballpark of desktop performance in a notebook, and now NVIDIA has taken that a step further with Max-Q, working with manufacturers like Asus to create notebooks that take power efficiency into account to optimize weight and dimensions, with the video card tailored for it. So you are completely clueless or willfully playing ignorant about this very important fact.

What's the TDP on the notebook 1070 and 1080 as compared to the desktop part?

Edit - Looked it up.

1070 mobile component - 120 watt
1080 mobile component - 165 watt

1070 desktop - 150 watt
1080 desktop - 180 watt
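For context, a small sketch comparing those quoted mobile and desktop figures (numbers as listed above, not official spec-sheet values):

```python
# Mobile vs. desktop TDP for the Pascal parts listed above (figures as quoted).
tdp_watts = {
    "1070": {"mobile": 120, "desktop": 150},
    "1080": {"mobile": 165, "desktop": 180},
}

for part, w in tdp_watts.items():
    print(f"{part}: mobile is {w['mobile'] / w['desktop']:.0%} of desktop TDP "
          f"({w['desktop'] - w['mobile']} W lower)")
# 1070: mobile is 80% of desktop TDP (30 W lower)
# 1080: mobile is 92% of desktop TDP (15 W lower)
```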
 
So you admit TDP isn't "wholly immaterial," and that having poor TDP is a handicap, i.e.: Vega.
The conversation morphed somewhere along the way. This was a conversation about desktop parts - the numbers I was quoting were for desktop parts. In my head we were conversing about desktop parts. In the desktop space - 50 watts is wholly inconsequential.

If the discussion is expanded to mobile computing - yes power use matters for obvious reasons (battery, heat, power brick size, fan noise) - but NOT for cost.
 
I wonder how much he needs to crank the fan to maintain a clock speed higher than 1500.

Fan isn't running hard? I can't hear it in the vid -- or maybe it's just my HP laptop speakers that keep me from hearing the fan. :)
 
Fan isn't running hard -- you can't hear it in the vid. Or maybe it's just my HP laptop speakers that keep me from hearing the fan. :)

I know, Ryan did say the fans are running low atm, but I'm curious: if he needs to maintain a 1500+ clock speed, how much will he have to crank the fan to do it?
 
[image attachment]


LOL!
 
Looks like their polygon throughput really was fixed :ROFLMAO:

So far in gaming it's landing between 1070 and 1080 level performance, just like the previous ones.
 
Idle power on Vega is...

"hovering between 14 and 16 watts"

in Witcher 3

"30 watts from PCI-Express and total is 300 watts"


"After running the Witcher 3 benchmark - 25 watts from PCI-E, 130 watt draw and 135 watt draw from two different eight-pins."
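Adding up those per-rail readings lands in the same ballpark as the ~300 W full-load figure; a trivial sketch (rail names are illustrative, numbers as quoted):

```python
# Sum of the per-rail readings quoted above (names illustrative, numbers as quoted).
rails_watts = {"pcie_slot": 25, "eight_pin_1": 130, "eight_pin_2": 135}
print(sum(rails_watts.values()))  # 290 W, close to the ~300 W full-load figure
```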
 
The conversation morphed somewhere along the way. This was a conversation about desktop parts - the numbers I was quoting were for desktop parts. In my head we were conversing about desktop parts. In the desktop space - 50 watts is wholly inconsequential.

If the discussion is expanded to mobile computing - yes power use matters for obvious reasons (battery, heat, power brick size, fan noise) - but NOT for cost.

nVidia's mobile chips for Pascal, unlike for Maxwell, are desktop cards, but downclocked. So for nVidia Pascal cards, mobile and desktop are essentially the same.
 
I am not sure this makes sense?

Did they make the arch less efficient to push clocks but didn't make it as far as they wanted?

Same number of cores (4096), 40% higher clock than Fury X, but it's not matching up?
 
The GCN architecture is just on its last legs. We saw the same thing with nV's G80: three generations of products with little change per generation, and the same thing happened with the GTX 2xx.
 
So a 1080 in the same comparison uses 43 cents in 24 hours at full tilt (or 1.8 cents per hour);
as mentioned earlier, a 300 watt Vega will use 72 cents of electricity in 24 hours at full tilt.

If you game for two hours on each, that's 6 cents for Vega, or 4 cents for the 1080.

And your point about not using full TDP for gaming is fair -- which makes it even less for both cards -- if we use 66% of TDP as an estimate, then it's 4 cents for Vega for two hours of gaming and 2.5 cents for the 1080 for two hours of gaming.

Still wholly immaterial -- and all the concern otherwise is folly.

Yeah, let me just spend more on an inferior product that's slower and draws more power.

I tried to bring AMD into the green by buying an RX 550 for my gaming server (only needed it to display video), but it looks like my $110 was futile.
 