AMD Vega to cost significantly more than expected

I have been around since '01 and well remember your "not trolling" threads bashing AMD.
I just will never understand rooting against the underdog, but I guess people cheer for winners.
 
There is a difference between rooting for the underdog based on merit and rooting for the underdog as a way of fighting against the power that is nVidia/Intel.

I don't think any of us can deny that Vega is far too late to the party compared to its competition. Fury was pre-empted by the 980 Ti by only two weeks and the two fought neck and neck, so for a while AMD had a matching product against nVidia at every price point, barring everything above the 980 Ti.

But Vega, whose performance competes with the 1070 and 1080, only arrived on the scene when the 1080 Ti had already been out for several months and its direct competition for a full year.

As much as I want to support AMD, Vega is 6 months too late. It also really didn't help when JHH gifted the first 15 V100 cards to AI researchers.

If we took Vega on its own and literally nothing else, then yes, the 1080 finally has competition. But the fact remains that the 1080 Ti doesn't look like it will have competition for a while, which is especially gloomy: if Volta yields the same kind of performance increase over Pascal, Vega would literally be relegated to competing with a 2070/2060, and the 2080 situation will repeat itself all over again.

The only thing really floating AMD's boat right now is FreeSync, which makes all this even more disappointing, as one of the monitors I have my eye on is a FreeSync monitor. I probably would have seriously considered jumping if Vega matched the 1080 Ti; merely matching the 1080 makes it a much less appealing proposition.
 
ASUS Vega X2
 
That's a dual-GPU card, not in the same league as the 1080 Ti.

When I talk GPUs, I always mean single-GPU cards. Dual-GPU cards are just two single GPUs CrossFired on the same PCB; otherwise they are exactly the same as running two separate Vega cards.
 
DX12 and Vulkan changed those rules.
 
I have yet to see any results of that labour.

Besides, that only applies to DX12 and Vulkan games, which are so far an extreme minority of games.

Once more DX12 and Vulkan games get released, I'll see whether my statement still holds true. For now, for all intents and purposes, I still consider dual-GPU cards to be nothing more than two GPUs sharing the same PCB.
 
DX12 and Vulkan changed those rules.

The only rule they changed is that supporting two or more GPUs became 10x more expensive and complex, hence support is, in reality, dying for good.

And who wants two cards/a dual card that only works in a tiny number of overpriced, poorly selling games, while the rest, especially the much better selling indie-type titles, don't support it and run as if on a single GPU?
 
DX12 and Vulkan changed those rules.

Everyone is moving away from multi-GPU, so I'm not sure they changed anything. Also, who wants a feature that only benefits 3-5 games? 90%+ of games are still DX11 and/or run better in DX11. By the time it becomes relevant I can just go ahead and buy a new GPU that is much faster than the current generation.
 
Or you can leave your GPU in, buy any other GPU, and have it complement the first in performance. No more generational or model lock-ins. Full RAM accessible on all cards. No more red team only or green team only. Why not one of each? Oh, and throw onboard video processing power into the mix too for a couple more FPS. The future of MGPU is very different indeed.
 
The only rule they changed is that supporting two or more GPUs became 10x more expensive and complex, hence support is, in reality, dying for good.

And who wants two cards/a dual card that only works in a tiny number of overpriced, poorly selling games, while the rest, especially the much better selling indie-type titles, don't support it and run as if on a single GPU?
For now.

BUT the future of MGPU a few years down the road, when the common game engines support it as a base feature, is much brighter than that.
 
Or you can leave your GPU in, buy any other GPU, and have it complement the first in performance. No more generational or model lock-ins. Full RAM accessible on all cards. No more red team only or green team only. Why not one of each? Oh, and throw onboard video processing power into the mix too for a couple more FPS. The future of MGPU is very different indeed.

That raises the interesting point of whether it affects FreeSync/G-Sync, or, more positively, whether it actually means one is no longer tied to one over the other and can choose whichever VRR monitor they always wanted.
Of course, that would require universal MGPU support across all modern games for it to even be consumer viable (assuming mixing GPUs and VRR techs works well).

Cheers
 
Or you can leave your GPU in, buy any other GPU, and have it complement the first in performance. No more generational or model lock-ins. Full RAM accessible on all cards. No more red team only or green team only. Why not one of each? Oh, and throw onboard video processing power into the mix too for a couple more FPS. The future of MGPU is very different indeed.

Ok. In 2 years we can come back to this post and you can let me know all about how great mGPU tech is now. I'll bookmark it.
 
Lol. I won't need to. In two years you'll know yourself.

My point is we will see who is right. I think mGPU is going away, since Nvidia doesn't seem all that interested in developing for it and neither does AMD. Didn't we have something like this before, Lucid Hydra or something? That failed as well.
 
Well, mGPU is meant to be moving from the driver to the API, so it makes sense that interest from AMD and Nvidia will drop; it should then be picked up by the API developers, Microsoft (DX12) and Khronos (Vulkan).
It comes down to how much the API developers commit to the concept of mGPU, but unfortunately it may still be some time before they really go all-in.
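To make the API-side point concrete, here is a minimal sketch (my own illustration, not code anyone posted in the thread) of how explicit multi-GPU surfaces on the Khronos side: Vulkan 1.1 lets the application enumerate "device groups" of linkable GPUs and decide for itself how to split work, instead of relying on a CrossFire/SLI driver profile.

```cpp
// Sketch: enumerate Vulkan 1.1 device groups (explicit multi-GPU exposed by the API).
// Error handling kept minimal; assumes a Vulkan 1.1 loader and driver are installed.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;                 // device groups are core in 1.1

    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "No Vulkan 1.1 instance available\n");
        return 1;
    }

    // The loader reports which physical GPUs can be linked into one logical device.
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount, {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i) {
        std::printf("Device group %u: %u GPU(s)\n", i, groups[i].physicalDeviceCount);
        // To use a group, the engine chains VkDeviceGroupDeviceCreateInfo into
        // vkCreateDevice and targets individual GPUs with vkCmdSetDeviceMask,
        // i.e. splitting the frame is the application's job, not the driver's.
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

The DX12 equivalent is explicit multi-adapter with per-node masks; in both cases the frame-splitting work moves out of the driver and into the engine, which is exactly why engine support is the sticking point.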

Cheers
 
For now.

BUT the future of MGPU a few years down the road, when the common game engines support it as a base feature, is much brighter than that.

So were SLI and CF. Always just down the road.

When a developer has to spend a lot of money and time on an ultra-niche feature, it's not happening.
 
It's like a gold rush for graphics cards thanks to the mining craze. It sucks a lot when you have to build computers for family/friends and they sometimes end up paying 100 bucks extra. Ugh.
 
They said it would be at or above the GTX 1080, and it is, aside from Civ 6, so the performance isn't really that low since it ended up exactly where AMD said it would. But I'll wait for gameplay benchmarks instead of timed loops. The power usage does look a bit insane, though.

Not just Civ 6, also Metro. Are you counting 4K only? And it's a $699 liquid-cooled version that at best averages around a stock, old $499 1080 FE that draws half the power. Get a 1080 Ti for the same price and still save power while getting much more performance? ;)

I don't think I've seen such a bad GPU release in the last 20 years.

A broader 20-30 game test will look even worse.
 
After seeing the middling performance and crazy power draw, I was really hoping AMD was going to bring a bit more balance back from the price angle.

The FreeSync rationale was really the last meager thread I could possibly see for anyone getting Vega, but there is no way that washes at these prices. Even miners are going to steer clear at this point given the power draw and less-than-stellar hash rate.

Very, very happy I stopped waiting and picked up a nice 1080 ti (from someone who was holding out to "upgrade" to Vega) last month.
 
My bad, wrong link. Here you go:
https://videocardz.com/71984/first-review-of-radeon-rx-vega-64-air-and-liquid-is-here

Called it ages ago as a glorified Fiji.

[Attached benchmark charts from the leaked review: Radeon-RX-Vega-64-2.jpg, Radeon-RX-Vega-64-5.jpg, Vega-64-BF1.jpg]

380 watts stock, damn. That is a new record for the most wattage from a single-GPU graphics card just to compete against nV's third-best. This is the R600 all over again; actually, worse.
 
Basically GTX 1080 performance using a lot more power and likely no OC headroom? Also, a lot more expensive? Am I missing something?
 
Basically, they have 1080 performance with 10-bit hardware support versus Nvidia's 8-bit hardware support. That is one thing many folks seem to miss, and it is one reason I stick with AMD. I am not going to buy a multi-hundred-dollar piece of hardware that looks completely washed out on my monitor and have to mess with settings to maybe get something that still looks worse than AMD, at least on my monitors.
 
Pascal should support 10-bit color, unless I'm mistaken. I don't have an HDR monitor, so that setting doesn't show up for me, but nV has been pushing HDR monitors with Pascal cards, and they also talked about it at launch.
 
Could be, but I had a 980 Ti and found it was a waste of money at the $650 I paid for it. An R9 380, R9 290X and R9 Fury all looked better than the 980 Ti did on my setup. For those who want to claim placebo: you are wrong. For what it's worth, this was with the Samsung 4K monitor I am using now and have had for the past two years. Maybe Pascal has true 10-bit support; I have no idea.
 
Performance is not low and they do not cost more.

$599/$699 for sub-1080 or merely 1080 performance at crazy power draw, 15 months later. :D

That 4x perf/watt, the NCU, tile-based rasterization and whatever else really shows. The person saying it was a glorified Fiji was so wrong... oh wait!

The ETH hash rate was barely even a third of what was claimed, as well. And you had better like noisy cards if you buy reference Vega: 5-8 dB more than a 1080 FE.

#waitfornavi (Yes, it will be 4096SP too).

There is always the "cinematic" effect in 4K GTAV.
 
https://www.newegg.com/Product/Product.aspx?Item=N82E16814137226

Vega 64 card $499.99 (yes OOS), but yes the rest of the air-cooled cards listed there are $599.99.
 
Basically, they have 1080 performance with 10-bit hardware support versus Nvidia's 8-bit hardware support. That is one thing many folks seem to miss, and it is one reason I stick with AMD.
You sound confused or are just grasping at straws. The 1080 most definitely outputs 10-bit.
 
The 1080 most definitely outputs 10-bit.

Coincidentally, this came up in another thread at https://hardforum.com/threads/consi...d-quadro-for-10-bits.1941528/#post-1043160681 just recently. The conclusion is that NVidia does indeed support 10-bit on the 1080 (and probably also on several older generations), but only in Direct3D in exclusive fullscreen mode (not in windowed mode or in OpenGL). That is what most Windows games use, though, so one could say NVidia supports 10-bit on Pascal for games, but not for business and content-creation applications. Curious to see whether that will change later with HDR going mainstream.
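For anyone curious what that means in practice, here is a rough sketch (my own example, not something from the linked thread) of the path a Windows game takes: it asks DXGI for a 10-bit-per-channel back buffer and then flips to exclusive fullscreen, which is the case where GeForce reportedly delivers the full 10 bits.

```cpp
// Sketch: request a 10-bit-per-channel back buffer (DXGI_FORMAT_R10G10B10A2_UNORM) in D3D11.
// Window, resolution and format choices here are illustrative, not from the thread.
#include <windows.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    // Bare Win32 window standing in for the game's real window.
    WNDCLASSA wc{};
    wc.lpfnWndProc   = DefWindowProcA;
    wc.hInstance     = GetModuleHandleA(nullptr);
    wc.lpszClassName = "tenbit_demo";
    RegisterClassA(&wc);
    HWND hwnd = CreateWindowA("tenbit_demo", "10-bit test", WS_OVERLAPPEDWINDOW,
                              0, 0, 1280, 720, nullptr, nullptr, wc.hInstance, nullptr);

    DXGI_SWAP_CHAIN_DESC sd{};
    sd.BufferDesc.Width  = 1280;
    sd.BufferDesc.Height = 720;
    sd.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM;  // 10 bits per color channel
    sd.SampleDesc.Count  = 1;
    sd.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    sd.BufferCount       = 2;
    sd.OutputWindow      = hwnd;
    sd.Windowed          = TRUE;  // created windowed; a game then flips to exclusive fullscreen

    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    IDXGISwapChain* swap = nullptr;
    D3D_FEATURE_LEVEL fl;
    HRESULT hr = D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0, nullptr, 0,
        D3D11_SDK_VERSION, &sd, &swap, &dev, &fl, &ctx);

    if (SUCCEEDED(hr)) {
        std::printf("10-bit swap chain created\n");
        // swap->SetFullscreenState(TRUE, nullptr);  // exclusive fullscreen: the case where
        //                                           // GeForce reportedly outputs real 10-bit
        swap->Release(); ctx->Release(); dev->Release();
    } else {
        std::printf("Failed to create 10-bit swap chain: 0x%08lX\n",
                    static_cast<unsigned long>(hr));
    }

    DestroyWindow(hwnd);
    return 0;
}
```

In windowed mode the same R10G10B10A2 back buffer goes through the desktop compositor, which is reportedly where the consumer GeForce drivers fall back to 8-bit output; 10-bit OpenGL remains a Quadro feature.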
 
With regard to pricing, is there any particular reason the GPU manufacturers can't accept that the GPU market is simply larger now? Why couldn't they simply ramp up production to accommodate the miners? The way I see it, for every cryptocurrency that becomes unprofitable, a couple of new ones emerge to take its place. It even seems like they are being created specifically so that consumer-grade GPUs can mine them. There may not be a way to beat them.
 
It's all about logistics, and mining being very volatile. It takes ~12 weeks just to make the chips, so say 15-18 weeks from the moment you decide on an unplanned production increase until you actually have product in stores. By then the mining demand may be gone again and you're sitting on excess inventory.

This is also why AMD and Nvidia have tried to sell otherwise-faulty, trash-bin chips as mining cards. It's money out of trash and it doesn't affect the logistics chain.

No sane miner, however, will buy Vega.
 
Coincidentally, this came up in another thread at https://hardforum.com/threads/consi...d-quadro-for-10-bits.1941528/#post-1043160681 just recently. The conclusion is that NVidia does indeed support 10-bit on the 1080 (and probably also on several older generations), but only in Direct3D in exclusive fullscreen mode (not in windowed mode or in OpenGL). That is what most Windows games use, though, so one could say NVidia supports 10-bit on Pascal for games, but not for business and content-creation applications. Curious to see whether that will change later with HDR going mainstream.

I have followed this quite in depth since the 780 Ti (as 10-bit can benefit some of my work), and you are correct. I hate banding with a passion, although newer monitors are getting better at this across the board with 10-bit LUTs etc.

Regarding the topic at hand, I was right. I said they'd mark it up for launch, at least in my country (they did), but I also expected the mining performance to be better. Seems everyone else is having fun marking it up globally too.
I'm going to wait another month before deciding and see what AA improvements they can bring. Not being invested in either adaptive-sync ecosystem gives me more options, and I might be able to save some $ buying in the EU or DXB.
And fuck Gibbo. What a slimy motherfucker; I hope his audience is pissed at him.
 
It's all about logistics, and mining being very volatile. It takes ~12 weeks just to make the chips, so say 15-18 weeks from the moment you decide on an unplanned production increase until you actually have product in stores. By then the mining demand may be gone again and you're sitting on excess inventory.

This is also why AMD and Nvidia have tried to sell otherwise-faulty, trash-bin chips as mining cards. It's money out of trash and it doesn't affect the logistics chain.

No sane miner, however, will buy Vega.

AMD boosted their sales of the R9 290(X) and R9 390(X) specifically by having "TheStilt", aka an AMD engineer on community-liaison duty, produce officially AMD Golden Signature-signed BIOSes optimizing them for memory-latency-sensitive hashing algorithms.

AMD specifically touted "cryptocurrency architecture enhancements" for the Vega in their slides.

I would bet that AMD will ninja-release a mining BIOS and their own forks of the mining software with optimizations explicitly for Vega if/when they have massive inventory of the card and gamers aren't buying it.

They likely won't release it before they have large inventories of Vega that people aren't buying.

^^^^^^^^^^^^^^^^^^^^^^^^^^

This is of course all under the supposition that AMD isn't simply lying out of their ass like they always do and that they are competent, so take this analysis with a grain of salt.
 