PC Perspective Radeon Vega Frontier Edition Live Benchmarking - 4:30PM CDT

This isn't a pro card: no ECC memory, just like a Titan card, meaning that while it would be fine for an individual workstation doing, say, CAD work that isn't on 24/7 and gets regular reboots, I wouldn't trust it in a server-like environment, just like a Titan card. Except the Titan actually delivers top performance in games.

Wait, even with HBM2, there's no included ECC? That kills the last justification for anyone buying this trash heap.

By the time this is available in quantity, the RX 580 shortage caused by the miners will be solved, and you'll be able to get faster performance from two of those cards for half the price.

How far below that bullshit 1600 MHz boost clock was this thing running? I knew that sounded "made up for marketing purposes" when the RX 580 can't even hit 1500.
 
Ouch....

I was hoping for 1080 at least, but not even that?

It's starting to look like they might as well have just refreshed the Fury X (possibly with GDDR5 instead of HBM)...

nV did more with a microarch change than AMD did with uarch AND node shrink.....
 
Mining is their life-support.

Just to be clear, this is my GPU progression: S3 Virge -> Rage3D -> Voodoo3 3500 AGP -> Radeon DDR AGP -> Radeon 9800 Pro -> 8800 GTX / SLI / Tri-SLI -> Radeon 4890 2-way Crossfire -> Radeon 6950 2-way Crossfire -> 980Ti 2-way SLI. Currently I'm using one of the old 6950s for 2D while I wait and save up for a fast single card.

I don't want AMD's graphics division to die... but I believe they deserve to.


Yeah, I wasn't expecting AMD to come out with a 1080TI/TitanX competitor, but I was thinking they ought to be able to beat the base 1080.

This is a major disappointment. Not even sure why AMD would want Ryan Shrout to do this reveal at this point. It's not helping them any.


And since we are doing GPU progressions, here is mine:

Matrox Mystique 3D -> 3DFX Voodoo 1 6MB -> GeForce 2 GTS -> GeForce 3 Ti500 -> GeForce 6800GT -> GeForce 9200 GT -> Radeon HD 5750 -> GeForce GTX 470 -> GeForce GTX 580 -> Radeon HD 6970 -> 2x Radeon HD 6970 (Crossfire) -> Radeon HD 7970 -> GeForce GTX 680 -> GeForce GTX Titan -> 2x GeForce 980Ti (SLI) -> Pascal Titan X
 
My prediction was that it MUST at least match the 1080 to be considered a technological advancement; anything lower than that is basically a Fury X rehash with a die shrink.

I was HOPING it would land between the 1080 and the 1080 Ti, but it looks like that's not going to happen.

Still, I'll keep an open mind and point out that this is just one person running the benchmark; hopefully more benchmarks will shed a more positive light on Vega.
 
My prediction was that it MUST at least match the 1080 to be considered a technological advancement; anything lower than that is basically a Fury X rehash with a die shrink.

I was HOPING it would land between the 1080 and the 1080 Ti, but it looks like that's not going to happen.

Still, I'll keep an open mind and point out that this is just one person running the benchmark; hopefully more benchmarks will shed a more positive light on Vega.

I think you are spot on; it seems to be a somewhat updated Fury X with more memory. Not what people were expecting. I hope they can pull off some driver magic and have something special for the gamer edition, but I'd bet it won't work any miracles, and even then it will take a lot of effort from the driver team.
 
You wanted to be disappointed in real time?



Said it before and I'll say it again: The RX480 should never have been released. RX580 was flat out pathetic, a non-product. Most still hoping for the best from AMD/ATI looked to Vega -- "Oh, just wait, they'll compete again, they'll take the performance crown!"

Fat. Fucking. Chance.

When a $1000+ prosumer card can't compare to Nvidia's consumer "flagship" (and I'm referring to the Ti, not the Titan) selling for $300+ less, AMD has nothing. Nothing. The "this isn't marketed towards gamers" nonsense is misdirection pure and simple, smoke and mirrors bullshit. If they can't produce a card that competes at this price level, there is absolutely no way in hell they'll deliver something competitive at a lower price point.

Step up or sell out, AMD. Put the money in to R&D, prove you're still relevant, or sell your patents to someone who gives a fuck and go die in a corner.

I'm tired of waiting. We're all tired.

Nope dude, I said exactly what I meant and meant exactly what I said. However, I tried watching bits and pieces of it and found myself happy that I missed it. Talk about a snooze fest. AMD cards have looked far better than Nvidia on the 4K monitor I own, simple fact. I prefer AMD and have personally found Nvidia to be a waste of my money, simple as that. None of those cards are worth $700 or more to me.
 
Nope dude, I said exactly what I meant and meant exactly what I said. However, I tried watching bits and pieces of it and found myself happy that I missed it. Talk about a snooze fest. AMD cards have looked far better than Nvidia on the 4K monitor I own, simple fact. I prefer AMD and have personally found Nvidia to be a waste of my money, simple as that. None of those cards are worth $700 or more to me.

Oh boy, did someone just say an AMD card "looks better" on a digital connection using HLSL based games?

Prepare your anus, my friend: the flames are coming...
 
I remember after the 290 release you said they (AMD) had something BIG coming after the next series (so not the 300).

The 400 series wasn't anything high end, and this looks to match something that released 13 months ago (probably for more than the 1080 costs now).

What happened? Was it supposed to be Vega, or was Polaris supposed to be really good?
The 400 was planned to clock much higher than it did. I think I wrote something about it once...
 
I think you are spot on; it seems to be a somewhat updated Fury X with more memory. Not what people were expecting. I hope they can pull off some driver magic and have something special for the gamer edition, but I'd bet it won't work any miracles, and even then it will take a lot of effort from the driver team.

There is no logic to that. A slightly updated Fury X that is even with or slightly slower than a 1080? That would be far more than slightly updated. Also, let's say it was only slightly faster than a Fury X: where is the logic in that when it would use more power on a smaller node than a Fury X? Heck, one of my Fury non-X cards would most likely be faster than the results he posted in that video.
 
There is no logic to that. A slightly updated Fury X that is even with or slightly slower than a 1080? That would be far more than slightly updated. Also, let's say it was only slightly faster than a Fury X: where is the logic in that when it would use more power on a smaller node than a Fury X? Heck, one of my Fury non-X cards would most likely be faster than the results he posted in that video.

If you took a Fury X and somehow clocked it to 1600 MHz, you would get essentially identical performance to what this card is offering.
 
Oh boy, did someone just say an AMD card "looks better" on a digital connection using HLSL based games?

Prepare your anus, my friend: the flames are coming...

LOL. *Slips on flame retardant suit and hides behind a bunker* :D Hey, the results may not be the same for others, but my results were repeatable on my setup, whether others like it or not. (R9 290, R9 380 and now R9 Furies in Crossfire. The EVGA 980 Ti looked washed out on my monitor and the other cards did not, at all, just the way it is.) I have no issues if someone wants to own Nvidia and it is worth it for them, but it is not for me, and I learned that the hard and expensive way.

Edit: On my setup, it was not just games but the desktop as well. I tried ignoring it, but after a while it was just not something I could ignore anymore, and I switched back to AMD, which I am happy about. (I just timed it a week or two before I should have: the Fury I bought for $360 went on sale for around $280, $80 or so less. My timing just stinks. :D)
 
If you took a Fury X and somehow clocked it to 1600 MHz, you would get essentially identical performance to what this card is offering.

You could be right, but then how does that explain the power consumption on a much smaller node?
 
There is no logic to that. A slightly updated Fury X that is even with or slightly slower than a 1080? That would be far more than slightly updated. Also, let's say it was only slightly faster than a Fury X: where is the logic in that when it would use more power on a smaller node than a Fury X? Heck, one of my Fury non-X cards would most likely be faster than the results he posted in that video.
Yes, I might be off in my estimate of how much of an increase it is over the Fury X. I have mostly been an Nvidia user for quite a while; the last AMD card I bought at retail was a 3870, the rest have come used from eBay at very good prices. I'd like AMD to be closer to Nvidia, or we will all soon either run Intel integrated graphics or pay $1000 and up. That doesn't sound fun to me, so I am actually fairly agnostic: whoever has the best price/performance gets my business.
 
Still, I'll keep an open mind and point out that this is just one person running the benchmark; hopefully more benchmarks will shed a more positive light on Vega.

No.

Nope dude, I said exactly what I meant and meant exactly what I said. However, I tried watching bits and pieces of it and found myself happy that I missed it. Talk about a snooze fest. AMD cards have looked far better than Nvidia on the 4K monitor I own, simple fact. I prefer AMD and have personally found Nvidia to be a waste of my money, simple as that. None of those cards are worth $700 or more to me.

You're high as shit, son.
 
LOL. *Slips on flame retardant suit and hides behind a bunker* :D Hey, the results may not be the same for others, but my results were repeatable on my setup, whether others like it or not. (R9 290, R9 380 and now R9 Furies in Crossfire. The EVGA 980 Ti looked washed out on my monitor and the other cards did not, at all, just the way it is.) I have no issues if someone wants to own Nvidia and it is worth it for them, but it is not for me, and I learned that the hard and expensive way.

Edit: On my setup, it was not just games but the desktop as well. I tried ignoring it, but after a while it was just not something I could ignore anymore, and I switched back to AMD, which I am happy about. (I just timed it a week or two before I should have: the Fury I bought for $360 went on sale for around $280, $80 or so less. My timing just stinks. :D)
Nvidia blames the monitor and panel makers for the washed-out colors. It can be fixed, see here:

http://www.pcgamer.com/nvidia-cards-dont-full-rgb-color-through-hdmiheres-a-fix/

Sure would be nice if they could fix this in the drivers.
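
The fix in that article boils down to forcing full-range RGB (0-255) output instead of limited-range video levels (16-235). As a rough illustration of why limited output looks washed out on a full-range display, here is a minimal sketch of the level expansion that has to happen somewhere in the chain; it's just the standard scaling math, not code from the linked article:

```python
def limited_to_full(value):
    """Expand a limited-range (16-235) video level to full-range (0-255) RGB."""
    expanded = (value - 16) * 255 / (235 - 16)
    return max(0, min(255, round(expanded)))

# If a full-range display is fed limited-range output without this expansion,
# black arrives as level 16 and white as 235 -> the "washed out" look.
print(limited_to_full(16))   # 0   (true black after expansion)
print(limited_to_full(235))  # 255 (true white after expansion)
```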
 
LOL. *Slips on flame retardant suit and hides behind a bunker* :D Hey, the results may not be the same for others, but my results were repeatable on my setup, whether others like it or not. (R9 290, R9 380 and now R9 Furies in Crossfire. The EVGA 980 Ti looked washed out on my monitor and the other cards did not, at all, just the way it is.) I have no issues if someone wants to own Nvidia and it is worth it for them, but it is not for me, and I learned that the hard and expensive way.

Placebo is a strong effect. I love AMD (not so much now, but I have nostalgic memories), but the argument that they "look better" comes from the old analogue/CRT days, when the ATI RAMDACs were far superior to Nvidia's. With digital connections, Windows says "draw 0.05, 0.92, 0.765" and the GPU sends those exact values down the cable. The only exception is when a gamma scaler is used at the driver level, which is entirely user-editable, so it would be more accurate to say "AMD's default gamma settings are more appealing to me".
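
To make the "gamma scaler at the driver level" point concrete, here is a minimal sketch of the kind of per-channel lookup table such a scaler builds; the 2.2 exponent is just an illustrative default, not either vendor's actual curve:

```python
def build_gamma_lut(gamma=2.2, size=256):
    """Per-channel lookup table mapping 8-bit input levels to adjusted output levels."""
    return [round(((i / (size - 1)) ** (1.0 / gamma)) * (size - 1)) for i in range(size)]

lut = build_gamma_lut()
# With no scaler the value passes through untouched (identity LUT);
# with a 2.2 curve, mid-grey 128 is lifted to roughly 186.
print(lut[128])
```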
 
You could be right, but then how does that explain the power consumption on a much smaller node?

60% higher clocks... that's it. It's a die-shrunk Fury X: a lot of power saved by the shrink process, but then blown again by clocking it higher than what it's able to sustain (not even 1600 MHz, but ~1400 MHz). So that's it, just a shrunk Fury X.
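
For a rough sanity check on the clock scaling, relative to the Fury X's 1050 MHz reference engine clock and ignoring any per-clock (IPC) changes:

```python
fury_x_mhz = 1050               # Fury X reference engine clock
for clock_mhz in (1600, 1400):  # rated boost vs. roughly sustained clock seen in the stream
    pct = 100 * (clock_mhz / fury_x_mhz - 1)
    print(f"{clock_mhz} MHz is about {pct:.0f}% above the Fury X")
# 1600 MHz -> ~52% higher, ~1400 MHz -> ~33% higher
```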
 
This is marketed at both workloads, workstation and gaming, and has a driver-selectable mode for each.

Yes and no. AMD actually put themselves in a strange position with this one. It runs both drivers, and for the intended audience, needs to.

So is it "optimized for gaming"? Well, the silicon certainly is as much as it will be. Drivers will improve, and history shows AMD takes a long time to do this. In fact, they make a marketing message out of it, comparing it to wine.

Thanks! I was unaware that this card was meant to cater to both.
 
Mining at 44MH/s on a single card is nice.

$1000 for said card is not.

Can you post where they said it was mining at 44 MH/s? Because I saw 30 MH/s on ETH; IDK if I missed that 44 MH/s figure from some unknown miner app.
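
For what it's worth, the $1000 complaint really comes down to payback time. A minimal sketch below, using only the two hashrates mentioned in this thread and a purely hypothetical revenue figure per MH/s per day (plug in whatever the actual ETH rate and your power cost are):

```python
def payback_days(card_price, hashrate_mhs, usd_per_mhs_per_day, power_cost_per_day=0.0):
    """Days until mining revenue covers the card price (ignores difficulty/price swings)."""
    daily_profit = hashrate_mhs * usd_per_mhs_per_day - power_cost_per_day
    return card_price / daily_profit

USD_PER_MHS_PER_DAY = 0.10  # hypothetical placeholder, NOT an actual quoted rate

for mhs in (30, 44):  # the two hashrates mentioned in this thread
    days = payback_days(1000, mhs, USD_PER_MHS_PER_DAY)
    print(f"{mhs} MH/s -> ~{days:.0f} days to pay off a $1000 card at that assumed rate")
```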
 
Edit: On my setup, it was not just games but the desktop as well. I tried ignoring it, but after a while it was just not something I could ignore anymore, and I switched back to AMD, which I am happy about. (I just timed it a week or two before I should have: the Fury I bought for $360 went on sale for around $280, $80 or so less. My timing just stinks. :D)

Having gone from 980Ti SLI to a 6950 while waiting for the GPU price madness to end, I can confirm I've had exactly the opposite experience. The 980Tis looked great with my calibrated Samsung 6290 40" 4K TV. The 6950 (granted, at 1080P since it's incapable of native res) is washed out to the point of introducing serious eye strain.
 
This is marketed at both workloads, workstation and gaming, and has a driver-selectable mode for each.


From what was said on the livestream, all gaming mode does is show the regular video card options in the AMD control panel, nothing more.
 
Placebo is a strong effect. I love AMD (not so much now, but I have nostalgic memories), but the argument that they "look better" comes from the old analogue/CRT days, when the ATI RAMDACs were far superior to Nvidia's. With digital connections, Windows says "draw 0.05, 0.92, 0.765" and the GPU sends those exact values down the cable. The only exception is when a gamma scaler is used at the driver level, which is entirely user-editable, so it would be more accurate to say "AMD's default gamma settings are more appealing to me".

Good thing it is not and was not a placebo, then. :D Six months with the 980 Ti and washed-out colors; I was sick of it, and there is no way I should have had to deal with that on a $650 video card! Oh well, I decided not to deal with it anymore, good for me. :D
 
Nvidia blames the monitor and panel makers for the washed-out colors. It can be fixed, see here:

http://www.pcgamer.com/nvidia-cards-dont-full-rgb-color-through-hdmiheres-a-fix/

Sure would be nice if they could fix this in the drivers.

Yep, I was and am using DisplayPort, but the idea is still the same: Nvidia had an issue and pointed fingers. Not something I should have to deal with on a $650 card; my choice, and that is that. On the other hand, I will not be buying Vega or upgrading to any other card until 2019; there is not really any point, and I prefer to keep my cards at least three years or so.

Edit: I also did that image test that they provided and I do not appear to have that problem, YES! http://imgur.com/Bw61l68
Thanks for the link, it is good to know.
 
Wow, this is pretty sad. What a waste of HBM2 supply. Nvidia is going to run away with the market. Everyone who pre-ordered needs to cancel/return it ASAP. This is going to be a sub-$500 card soon.
 
I remember from when I was google-fu-ing stuff on monitor calibration that AMD cards in general play nicer with monitor calibration than nVidia, because AMD cards can apply ICC profiles at the hardware level, so even games that completely ignore desktop colour profile settings (DX9 games are pretty bad for this) still get your calibrated ICC. nVidia GPUs specifically can't do that.

It took me bloody ages and countless trials and errors to get my IPS and TN monitors to look similar colour-wise and not washed out. My TN uncalibrated is a mess next to my uncalibrated IPS, but after calibration they look more or less the same.
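
As an aside, the "hardware level" part is essentially the card's gamma ramp (the LUT portion of the profile; full ICC handling involves more than this). Here is a minimal Windows-only sketch of loading a ramp through the standard Win32 SetDeviceGammaRamp call via ctypes; the 2.2 curve is just a placeholder for whatever a real calibration produces:

```python
import ctypes  # Windows-only sketch: uses the Win32 GDI gamma ramp API

gdi32 = ctypes.windll.gdi32
user32 = ctypes.windll.user32

# A gamma ramp is 3 channels x 256 entries of 16-bit output levels.
ramp = (ctypes.c_ushort * (3 * 256))()
gamma = 2.2  # placeholder curve; calibration software would fill this from the profile's LUT
for i in range(256):
    level = int(round(((i / 255.0) ** (1.0 / gamma)) * 65535))
    for channel in range(3):
        ramp[channel * 256 + i] = level

hdc = user32.GetDC(None)                                # device context for the primary display
ok = gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))  # push the ramp into the card's LUT
user32.ReleaseDC(None, hdc)
print("ramp applied" if ok else "ramp rejected by the driver")
```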
 
The one good thing I can say about this card is that it does trade blows with the Quadro P5000, a $2000 card, in certain applications.

https://www.extremetech.com/computing/251780-amd-radeon-vega-frontier-edition-benchmarks

[VegaFE-Chart1.png: benchmark chart from the linked ExtremeTech article]


Of course, that requires you to put up with the fact that the drivers aren't certified. Certification will come in the FirePro packaging, with a higher price.

So good for professional applications, assuming you don't mind rolling the compatibility dice.
 
LOL. *Slips on flame retardant suit and hides behind a bunker* :D Hey, the results may not be the same for others, but my results were repeatable on my setup, whether others like it or not. (R9 290, R9 380 and now R9 Furies in Crossfire. The EVGA 980 Ti looked washed out on my monitor and the other cards did not, at all, just the way it is.) I have no issues if someone wants to own Nvidia and it is worth it for them, but it is not for me, and I learned that the hard and expensive way.

Edit: On my setup, it was not just games but the desktop as well. I tried ignoring it, but after a while it was just not something I could ignore anymore, and I switched back to AMD, which I am happy about. (I just timed it a week or two before I should have: the Fury I bought for $360 went on sale for around $280, $80 or so less. My timing just stinks. :D)

This could have something to do with matching your HDMI display output levels: 0-255 vs. 16-235, i.e. the full/limited range settings.

However, I've had the same experience. I spent years trying to get Nvidia cards to look right on my projectors, specifically a Panasonic AE500, an AE800, an Epson 8350, and a Panasonic AE8000U. I never got it quite right, and that's with hours and hours of tinkering, guides, and walkthroughs. Nvidia always looked a bit washed out to me.
I put in an AMD 285 card and BAM, problem fixed. I went to a Fury X card and BAM, problem still fixed. I can't explain it; it should be a settings match that I should have been able to make, but I never could get the Nvidia cards to look as natural as the AMD cards. That being said, the difference is subtle, and unless you are a video nazi I'm not sure most people would even notice, but it always bothered me. I did not have that problem with my various monitors (Dell 3014, HP 27", Toshiba 32" HDTV, etc.).
 