PC Perspective Radeon Vega Frontier Edition Live Benchmarking - 4:30PM CDT


Really though, the only thing this proves is that PCPer is run by idiots who would rather live-stream than do things right and triple-check their results :D

Who knows, Vega might be suffering from a similar performance hit. Or was Vega installed in a different machine? I didn't watch the whole stream.
 
The problem is that by the time Vega cards are available to regular (gaming) consumers, if they only match 1080 performance, Nvidia can lower prices to stay competitive and then just release its next architecture to dramatically surpass it. AMD needed to release comparable performance six months ago, or it needs to release performance surpassing a 1080 Ti for the same money now.

There either needs to be a cost or performance benefit if they want to reclaim market share.
 
I'm much more inclined to believe their gaming results, something they do every day!

Vega FE = GTX 1075?
 
So it looks like Vega is getting whipped by the 1080 Ti and losing out overall to the 1080. Not really sure what to make of this. The card is a "prosumer" card for workstation graphics and seems to be good at that, especially for the price, but it's not using certified drivers. So for $1k you get a good card for that purpose, but still not really a professional card, and not at all worth the money for gaming. Hmmm. Wonder how it is for mining?
 
I'm not in the market anyway but of course we all should wait for consumer cards and independent reviews. I do think it's going to take time for developers to release patches or code for a second architecture.

In 2-3 years when I get a new card I hope there is some healthy competition.
 
Nvidia blames the monitor and panel makers for the washed-out colors. It can be fixed; see here:

http://www.pcgamer.com/nvidia-cards-dont-full-rgb-color-through-hdmiheres-a-fix/

Sure would be nice if they could fix this in the drivers.


They did, some years ago. It's even linked to in the article you posted:

thagerty on 2014-12-24 at 18:12 said:


Well can you believe it?!?!? NVidia FINALLY implemented this feature in their most recent drivers. (347.09)

In NV Control Panel, Adjust Desktop Color Settings:

Now you can toggle Digital Color Format (RGB/YCBCR444) and Dynamic Range (Full / Limited).
 
Regardless of whether the Frontier Edition is good at gaming or not, my expectation for RX Vega performance was beating the 1080 by a convincing 10-20% margin, but so far the FE is 10% slower than the 1080, which means RX needs to pull over 30% more performance than the FE. I don't think that's doable.
Looking at the power draw and knowing GloFo's 14nm, the extra performance won't come from clocks; I'm surprised they even managed 1600 MHz. And since Ryan said the drivers were up to date to within a month or so of validation, the best RX might get is an extra 5-10%, which would put it on par with the 1080. If that's the case, RX Vega had better be coming in at $399.
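As a rough sanity check on that math (taking the post's own assumption that the FE sits about 10% behind a stock GTX 1080), the required uplift works out as follows:

```python
# Back-of-the-envelope: how much faster than Vega FE would RX Vega need to
# be to beat a GTX 1080 by 10-20%, if FE really trails the 1080 by ~10%?

gtx_1080 = 1.00             # normalise the GTX 1080 to 1.0
vega_fe = 0.90 * gtx_1080   # assumption from the post above: FE ~10% slower

for target_lead in (0.10, 0.20):              # desired lead over the 1080
    rx_vega = gtx_1080 * (1 + target_lead)
    uplift_over_fe = rx_vega / vega_fe - 1
    print(f"beat the 1080 by {target_lead:.0%} -> RX Vega needs "
          f"{uplift_over_fe:+.0%} over FE")

# beat the 1080 by 10% -> RX Vega needs +22% over FE
# beat the 1080 by 20% -> RX Vega needs +33% over FE
```

So the "over 30%" figure holds for a 20% lead over the 1080; even a bare 10% lead would still need more than double the 5-10% driver uplift speculated above.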

Once it was revealed that Vega had the exact same core config (4096:256:64) as Fiji, I don't think it was reasonable to expect it to beat GTX 1080 by a significant margin. Also they were hitting some thermal throttling at 1440 MHz.
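That expectation is easy to sanity-check: with an identical 4096-shader configuration, the on-paper gain over Fiji is purely the clock ratio. A minimal sketch, using the quoted Fury X spec clock plus the boost and throttled clocks mentioned above:

```python
# Peak FP32 = shaders * 2 ops per clock (FMA) * clock. With the same
# 4096:256:64 config as Fiji, the raw uplift is purely the clock ratio.

def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000   # result in TFLOPS

fury_x = tflops(4096, 1.050)                # Fury X spec clock
for label, clock_ghz in (("Vega FE @ 1600 MHz boost", 1.600),
                         ("Vega FE @ 1440 MHz throttled", 1.440)):
    vega = tflops(4096, clock_ghz)
    print(f"{label:30s} {vega:5.1f} TFLOPS "
          f"({vega / fury_x - 1:+.0%} vs Fury X's {fury_x:.1f})")
```

A 37-52% raw uplift over a Fury X lands roughly in GTX 1080 territory before counting any per-clock improvements, which lines up with the results being discussed.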
 
Sooo... you could say the Radeon Vega is the Conor McGregor to Nvidia's Titan Mayweather. You wish with all your heart that Vega will win, but deep down you know it's gonna get embarrassed.

*Immediately runs off to put on flame retardant suit*
 
Well, it's underperforming the P5000 more often than not, and you'll notice that the P4000 is right on Vega's tail as well, even outperforming it in some cases. The Quadro series has application certification; Vega FE does not. At the end of the day, even matching the P5000 regularly is a pyrrhic victory because of how the die sizes and power draw figures compare.

AMD stock is taking a hit


Edit: stock appears unrelated, NV taking similar hit as well

Dude, the whole Nasdaq has been taking a hit the last few days. One review about a pre-release Vega isn't causing a panic sell.
 
They did, some years ago. It's even linked to in the article you posted:
I had been using the patch exe to fix it; good to know it's now fixed and in the control panel. It's actually under "Adjust video color settings" for my monitor; it may be different if you have a TV. The patch still works, but it's better to change it in the control panel. Thanks for the info!
 
Just WTF is a certified driver, and what is stopping AMD from getting it 'certified' soon?
 
Just WTF is a certified driver, and what is stopping AMD from getting it 'certified' soon?

A certified driver is a driver with hardware acceleration enabled for certain professional applications. This has nothing to do with gaming.

And AMD already has certified drivers.

The certification essentially covers testing, support, etc. at an enterprise level.
 
Just WTF is a certified driver, and what is stopping AMD from getting it 'certified' soon?

It costs a lot of money and resources (time, work, a dedicated team) to gain that pro certification. A pro certification means you are safe from crashes, bluescreens, failed compilations, and a large number of incompatibilities with an even larger number of issues.

You need certified drivers for pro applications, and each one of those requires its own certification. For example, Vega FE is crashing/hanging in Blender using the Pro drivers in the GamersNexus review.

So yeah... those kinds of issues.
 
I'm much more inclined to believe their gaming results, something they do every day!

Vega FE = GTX 1075?


So as not to offend fanbois, GTX 1079... but who's counting :p. I mean, really, I get that it's a workstation-class card, and people will ride that as far as they can, but it's not performing in line with the current offerings it SHOULD BE competing with. I think the vast majority of us knocking it on gaming performance realize it's not 100% a gaming card, but we also realize that the RX Vega consumer version will not be significantly faster... by any stretch of the imagination.
 
It's good performance; it will come down to price for the gaming cards. If the memory really is 160 bucks for 16 GB, they should have switched to GDDR5 yesterday.
 
I feel disappointed with these results, but not surprised. I can't help but wonder what differences, if any, the RX Vega will have compared to this card. Only time will tell, once the RX Vega gets reviewed with decently optimised drivers.
Also, does anybody know if AMD has another version up its sleeve with better performance, similar to Nvidia with its Ti cards?
I am waiting to buy a new card and was originally planning on a 1080 or 1070, but I will hold off until RX Vega is tested and then decide.
 
It's good performance; it will come down to price for the gaming cards. If the memory really is 160 bucks for 16 GB, they should have switched to GDDR5 yesterday.

They need the help of HBM to keep power under control and to feed the bandwidth-starved architecture that GCN is.
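To put rough numbers on the bandwidth side of that argument (using commonly quoted memory specs, not measurements; bandwidth is just bus width times per-pin data rate):

```python
# Memory bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps).
# Figures below are the usual quoted specs, not measurements.

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

configs = {
    "Vega FE, 2x HBM2 (2048-bit @ ~1.89 Gbps)": (2048, 1.89),
    "Fury X, 4x HBM1 (4096-bit @ 1.0 Gbps)":    (4096, 1.0),
    "GTX 1080, GDDR5X (256-bit @ 10 Gbps)":     (256, 10.0),
    "Typical GDDR5 card (256-bit @ 8 Gbps)":    (256, 8.0),
}

for name, (bus_bits, gbps) in configs.items():
    print(f"{name:44s} {bandwidth_gbs(bus_bits, gbps):6.0f} GB/s")
```

Dropping to a 256-bit GDDR5 bus would roughly halve Vega's bandwidth, and widening the bus instead would cost board power, which is presumably why HBM2 stays despite the price.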
 
I feel disappointed with these results, but not surprised. I can't help but wonder what differences, if any, the RX Vega will have compared to this card. Only time will tell, once the RX Vega gets reviewed with decently optimised drivers.
Also, does anybody know if AMD has another version up its sleeve with better performance, similar to Nvidia with its Ti cards?
I am waiting to buy a new card and was originally planning on a 1080 or 1070, but I will hold off until RX Vega is tested and then decide.
I don't find them surprising either; the card reads a lot like a one-chip Fury card (the Fury card was two chips on a board, right?).
 
300 watts, can't outperform a 1080, priced higher than a Titan Xp, but semi-professional drivers.

That about sums it up? $500 for an AIO water cooler, for extra bling.

Trying to say wait for the gaming versions of Vega? They're just lower-memory versions. It's not like they're going to pull a miracle and hit 150% of the Frontier Edition's performance.
 
I'm starting to wonder if they ran into issues and made a major revision or fix - outside of an HBM2 delay. Which could mean Vega FE is them offloading this old revision, as Raja in his Reddit AMA stated RX Vega would be "way faster" than Vega FE.
 
I'm starting to wonder if they ran into issues and made a major revision or fix - outside of an HBM2 delay. Which could mean Vega FE is them offloading this old revision, as Raja in his Reddit AMA stated RX Vega would be "way faster" than Vega FE.
Vega 2 rushed out? Could be, but I doubt it; that would be a rabbit out of a hat if there ever was one.
 
I'm starting to wonder if they ran into issues and made a major revision or fix - outside of an HBM2 delay. Which could mean Vega FE is them offloading this old revision, as Raja in his Reddit AMA stated RX Vega would be "way faster" than Vega FE.

You mean this quote:

"RK: Consumer RX will be much better optimized for all the top gaming titles and flavors of RX Vega will actually be faster than Frontier version!"

"Will actually be faster" is not the same as "way faster".
 
Vega 2 rushed out? Could be, but I doubt it; that would be a rabbit out of a hat if there ever was one.

Vega 20 is an entirely different product. It's essentially Vega 10 with FP64 added, plus 4 stacks of HBM2, "7nm", and GMI links. And that one is hard to imagine coming before 2019.
 
Vega 20 is an entirely different product. It's essentially Vega 10 with FP64 added, plus 4 stacks of HBM2, "7nm", and GMI links.
That sounds more doable to me... not that I know squat. So yeah, it could be... Is there a definite date for gaming Vega now, or not yet?
 
This could have something to do with matching your HDMI display level output... 0-255 or 16-235, the full/limited settings.

However, I've had the same experience. I spent years trying to get Nvidia cards to look right on my projectors, specifically a Panasonic AE500, an AE800, an Epson 8350, and a Panasonic AE8000U. I never got it quite right, and that's with hours and hours of tinkering, guides, and walkthroughs. Nvidia always looked a bit washed out to me.
I put in an AMD 285 card and BAM, problem fixed. I went to a Fury X card and BAM, problem still fixed. I can't explain it; it should be a settings match I could have made, but I never could get the Nvidia cards to look as natural as the AMD cards. That being said, the difference is subtle, and unless you are a video purist, I'm not sure most people would even notice, but it always bothered me. I did not have that problem with my various monitors (Dell 3014, HP 27", Toshiba 32" HDTV, etc.).
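For anyone wondering why a full/limited mismatch looks washed out specifically: limited ("video") range squeezes the 0-255 signal into 16-235, so a display still expecting full range shows black at 16 and white at 235. A minimal sketch of the standard 8-bit mapping:

```python
# Full-range RGB uses 0-255; limited ("video") range maps it into 16-235.
# If the GPU sends limited levels but the display expects full range,
# black is displayed as 16/255 and white as 235/255 -- the washed-out look.

def full_to_limited(value):
    """Map a full-range 8-bit value (0-255) into limited range (16-235)."""
    return round(16 + value * 219 / 255)

for label, value in (("black", 0), ("mid grey", 128), ("white", 255)):
    sent = full_to_limited(value)
    print(f"{label:8s} full={value:3d} -> limited={sent:3d} "
          f"(a full-range display shows {sent}/255)")
```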

There are multiple reasons why anything but the latest Nvidia cards can look 'crap' compared to AMD cards for discerning users and/or niche use cases.

All 9-series and below didn't do higher than 8 bpc outside of DX applications, so with no DX you got 8-bit colour banding. The way it's meant to be played, eh! It was disgusting to see them do that, to be honest, and it really put me off ever owning one.
HDMI 2.0 was the major point of contention for many on the Fury X. Well... the HDMI 2.0 standard can't do 4K60 / 10-bit / 4:4:4; it doesn't have enough bandwidth (rough math at the bottom of this post).
So strike 2, but better than the HDMI 1.4a on the Fury X.
Next to a decent 10-bit DisplayPort screen it can be noticeable. Once you see it, you can't unsee it; banding is nasty, immersion-breaking to me at times in a game or video.
I've also had major issues with Nvidia and older professional projectors, other custom resolutions, long cable runs, etc. A cheap ATI/AMD card: no problems.

Re: Vega: although the results are disappointing for a card at that price, I still feel it would be best to test and judge gaming conclusively on a gaming card, not this dual-driver, hokey launch weirdness. AMD launch drivers are almost always shitty. Except on W10, where it was Nvidia having the BSODs for once.

That said, they're damned if they delay it longer and damned if it's slow... a tricky and shitty situation to be in when you're R&D-limited.
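The bandwidth math referenced above, using HDMI 2.0's 600 MHz TMDS limit (18 Gbps on the wire, 14.4 Gbps of actual video data after 8b/10b coding) and the standard 4K60 timing:

```python
# HDMI 2.0: 3 TMDS channels at up to 600 MHz with 8b/10b coding ->
# 18 Gbps raw, 14.4 Gbps of actual video data.
# 4K60 uses the CTA timing of 4400 x 2250 total pixels (incl. blanking).

HDMI20_EFFECTIVE_GBPS = 14.4
pixels_per_second = 4400 * 2250 * 60        # 594 MHz pixel clock

for bpc in (8, 10):
    bits_per_pixel = 3 * bpc                # RGB / YCbCr 4:4:4
    needed_gbps = pixels_per_second * bits_per_pixel / 1e9
    verdict = "fits" if needed_gbps <= HDMI20_EFFECTIVE_GBPS else "does NOT fit"
    print(f"4K60 4:4:4 at {bpc} bpc needs {needed_gbps:.2f} Gbps -> {verdict}")

# 8 bpc needs 14.26 Gbps -> fits (barely); 10 bpc needs 17.82 Gbps -> doesn't,
# which is why 10-bit 4K60 over HDMI 2.0 means dropping to 4:2:2 or 4:2:0.
```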
 
I'll admit that this doesn't look particularly good, but is this Frontier Edition really reflective of final consumer cards?

While in this particular generation Nvidia has Quadro cards that perform right about on par with a 1080, this hasn't always been the case.

Is it possible this thing simply isn't pushed as high, clock-wise or core-wise, as a final top-end consumer model would be?
 
There are multiple reasons why anything but the latest Nvidia cards can look 'crap' compared to AMD cards for discerning users and/or niche use cases. [...]
Or, more simply, it's that ATI DNA in them. ATI cards, as I remember, were better at 2D and video than most others.
 
OK, so it's not a powerhouse on performance, but let's at least wait for the price before throwing stones at RTG.
I would rather pay 40k for a Mustang than 150k for a Ferrari.
 