Spitballing Vega GPU Performance

Discussion in '[H]ard|OCP Front Page News' started by Kyle_Bennett, May 17, 2017.

  1. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    46,470
    Joined:
    May 18, 1997
    Spitballing Vega GPU Performance

    This is an interesting article over at Tech Report that is "speculating" on AMD's new Vega GPU and its performance. Now keep in mind that Scott Wasson, the owner of Tech Report, works for AMD. So if I am going to jump on the rumor train with any site, it is probably going to be Tech Report.

    Either way, none of these dart throws suggest the eventual RX Vega will have what it takes to unseat the GeForce GTX 1080 Ti atop the consumer graphics-performance race, as some wild rumors have postulated recently. I'm willing to be surprised, though. We also can't account for the potential performance improvements from Vega's new primitive shader support or its tile-based Draw Stream Binning Rasterizer, both of which could mitigate some of these theoretical shortcomings somewhat.

    My sources in the industry have indicated a couple of different positions on how they think Vega will stack up against NVIDIA. My most trusted sources seem to think that it will edge out GTX 1080 by a slight margin, and I have a couple that think Vega will fall just slightly short of GTX 1080. What I am surely not getting from any sources is that Vega is a "1080 Ti killer." Wherever Vega lands, we certainly need AMD's Radeon Technology Group back in the mix at the high end even if it is not the ultra-high end.


    What we do know from AMD's past is that its cards have done great in some applications, like bitcoin mining, and have even trounced NVIDIA's comparable products. And while I was watching Raja demo Vega running DeepBench and being successful against NVIDIA in that demo, it did not give me huge warm and fuzzies about Vega and gaming. As always, though, the proof will be in the pudding.
     
  2. odditory

    odditory [H]ardness Supreme

    Messages:
    4,166
    Joined:
    Dec 23, 2007
    Unless it beats a 1080 for $399, I don't see them selling too many. And if it does beat the 1080 at $399, Nvidia will drop prices.

    It's checkmate for Nvidia no matter what, unfortunately. Ryzen created a space for itself because Intel's a lumbering dinosaur, but Nvidia is no Intel; they move quickly and always have something faster ready to drop at a moment's notice.
     
    Last edited: May 17, 2017
  3. Gigantopithecus

    Gigantopithecus [H]ard|Gawd

    Messages:
    1,080
    Joined:
    Aug 6, 2009
    Thanks, Kyle. Any word on what kind of juice Vega's gonna need to sip to perform like the GTX 1080?
     
  4. Pusher of Buttons

    Pusher of Buttons 2[H]4U

    Messages:
    2,899
    Joined:
    Dec 6, 2016
    If they can compete at 1080 speed and pricing, it'll be solid. The 1080 Ti is a niche market when you take into account the cost of a monitor that will actually do it justice. Whatever it is, it needs to not bomb all your VR tests....
     
    Ninjaman67, Dunnlang and rgMekanic like this.
  5. Bandalo

    Bandalo 2[H]4U

    Messages:
    3,692
    Joined:
    Dec 15, 2010
    One, there's not going to be that many even available until they get their HBM supply chain worked out. Two, there's going to be a LOT of people who will buy them, regardless of performance, because they somehow feel they're "sticking it to Nvidia", who had the audacity to charge a lot of money for a fast video card.
     
  6. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    46,470
    Joined:
    May 18, 1997
    The card that Raja showed off yesterday had two 8-pin PCIe power connectors on it.

    vegafes.jpg
     
    maclem8223 likes this.
  7. Gigus Fire

    Gigus Fire [H]ard|Gawd

    Messages:
    1,768
    Joined:
    Oct 14, 2004
    I found this interesting comment from another site:
    "His graphic just says "time to complete DeepBench", but DeepBench consists of four different tests and each one can have vastly different configurations of the underlying matrices, networks, etc. (whatever is being used for that particular test). From what I see on the DeepBench site, his graphic doesn't mean much on its own. AMD always seems to pull that shit and it's getting annoying."
     
  8. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    46,470
    Joined:
    May 18, 1997
    I have never run DeepBench, but maybe if I get a chance.
     
  9. SighTurtle

    SighTurtle Gawd

    Messages:
    941
    Joined:
    Jul 29, 2016
    Kyle, an image of Raja holding it shows it having an 8-pin and a 6-pin connector.
     
  10. grtitan

    grtitan [H]ard|Gawd

    Messages:
    1,108
    Joined:
    Mar 18, 2011
    I don't take anything away from Nvidia; they have been on schedule with their releases and their performance is great. But I personally hate how they included the telemetry in their drivers without customers' consent and force account creation for GFE, so if I can, I will let my money talk to express my disapproval.

    And yes, both Intel and Nvidia overcharged because they could, and it would be very stupid of us to ignore that and just give them money because they are offering a temporarily lower price.
     
    Mong00se likes this.
  11. Namx01

    Namx01 n00bie

    Messages:
    43
    Joined:
    Feb 4, 2008
    Probably no sipping, more like slurping, though in a good way :)
     
  12. Bandalo

    Bandalo 2[H]4U

    Messages:
    3,692
    Joined:
    Dec 15, 2010
    AMD would charge the exact same prices if they could. AMD's lack of competitive products is what let the prices go up for the only company left in the market. If AMD takes over with Vega/Threadripper and Intel and Nvidia don't respond, I'll bet you dollars to donuts Vega2/Threadripper2 see much higher prices. It's not corporate greed, it's pure supply and demand.
     
    HockeyJon and Seventyfive like this.
  13. grtitan

    grtitan [H]ard|Gawd

    Messages:
    1,108
    Joined:
    Mar 18, 2011
    And then when they become dicks, we go back to Intel and Nvidia. But "helping" them now will just screw us over for good.

    Already said in my original post.

    Already addressed in my post.

    The whole point of my post.

    I am not a corporate fanboi like many here, which should be clear from my post. I go for the customer first, but the blind sheep don't help themselves and need some encouragement or a different point of view.
     
    Cha0s, JackNSally and fs123 like this.
  14. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    46,470
    Joined:
    May 18, 1997
    My bad, I went by what is on the official website. I would suggest that is a better reference than the card he had on stage.
    http://pro.radeon.com/en-us/vega-frontier-edition/
     
  15. Bandalo

    Bandalo 2[H]4U

    Messages:
    3,692
    Joined:
    Dec 15, 2010
    My post said it's NOT corporate greed, it's supply and demand. Intel/Nvidia aren't being "dicks", they're charging what the market will bear for enthusiast products.

    I'm in it for performance. I personally am not going to accept lower performance just to help a "struggling" company get back on their feet. If they were a small business or something, it'd be different, but AMD is a huge multi-national corporation. You help them out now, and in 5 years when they're on the top, they'll reward you by charging the same high prices people complain about now.
     
    sleepeeg3, Fleat, GoldenTiger and 2 others like this.
  16. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    46,470
    Joined:
    May 18, 1997
    SighTurtle likes this.
  17. vegeta535

    vegeta535 Gawd

    Messages:
    922
    Joined:
    Jul 19, 2013
    That's fine and all if it is competitive with a Ti, but if it doesn't beat a 1080, then it is not in a good way.
     
  18. Nolan7689

    Nolan7689 Limp Gawd

    Messages:
    439
    Joined:
    Jun 5, 2015
    Why? Why would it be checkmate if they beat a 1080 for $100 less? If they released at the same price, I could see Nvidia dropping their prices being a problem for AMD, but at a lower price I don't see it as a checkmate.
     
  19. vegeta535

    vegeta535 Gawd

    Messages:
    922
    Joined:
    Jul 19, 2013
    This so much. I can't believe people think AMD is fighting the good fight. AMD is fighting to stay alive and be relevant. AMD would gouge people just as badly as Intel or Nvidia. People seem to forget the $1,000 Athlon 64 FX chips, or even the $1,000 FX-9590 at launch.
     
  20. Chaos Machine

    Chaos Machine Limp Gawd

    Messages:
    397
    Joined:
    Apr 13, 2012
    When you are dropping $700 on a video card, $300 for a suitable display isn't much of a stretch. But yeah, it's for well heeled gamers.
     
  21. Pusher of Buttons

    Pusher of Buttons 2[H]4U

    Messages:
    2,899
    Joined:
    Dec 6, 2016
    $300 for a G-Sync display that'll do the 1080 Ti justice? It's well over $700 for a basic 4K G-Sync monitor.
     
  22. tungt88

    tungt88 [H]ard|Gawd

    Messages:
    1,720
    Joined:
    Jan 14, 2008
    Not necessarily "well-heeled": I know quite a few car enthusiasts who have spent thousands (even tens of thousands) modding their cars, yet they make normal mid-five-figure salaries like most other people (admittedly they are eating PBJ, spaghetti, ramen, and canned food 50 weeks a year if single, and whatever their wife wants to cook if not).

    I have a nice income by US living standards, but I am "gaming" on a 4K@60Hz TV that cost me $260 two years ago, and am currently using a GTX 1070, which I got on sale (I am planning to move up to the GTX 1080 Ti with the help of Newegg gift cards and turn the GTX 1070 into my backup card).

    As for the purported Vega gaming performance? As expected, but its expected GTX 1080-range results mean that I will most likely be getting a GTX 1080 Ti in the summer (no reason to move from a 1070 to 1080-ish performance).
     
  23. grtitan

    grtitan [H]ard|Gawd

    Messages:
    1,108
    Joined:
    Mar 18, 2011
    Same as mine, yet you insist on changing it to that. I already said that much in my first post.

    Already said that in my first post, and to add: my last three or so cards, including my present one, were Nvidia for that reason.

    Hence my quotes around "helping" and calling Intel and Nvidia dicks. But again, you insist on not reading my comment as I posted it.

    Never said that, but like the other guy, you insist on misreading my posts.

    Yet my main complaint, their telemetry spyware, is conveniently ignored on these forums.
     
  24. odditory

    odditory [H]ardness Supreme

    Messages:
    4,166
    Joined:
    Dec 23, 2007
    Well, you can already get a 1080 for close to $400. Nobody's paying MSRP. There are already tons of them out there: used, new, rebates galore. Meanwhile, AMD is likely to struggle with the usual constrained supply issues for months post-launch, which will keep pricing high.

    Checkmate.
     
    Last edited: May 17, 2017
    tungt88 likes this.
  25. Rvenger

    Rvenger [H]ard|Gawd

    Messages:
    1,217
    Joined:
    Sep 12, 2012
    Since I am driving 4k, looks like my next GPU is Volta.
     
  26. DuronBurgerMan

    DuronBurgerMan Gawd

    Messages:
    621
    Joined:
    Mar 13, 2017
    Very happy with my 1080 Ti. Ironically, my main monitor is just a 1080p 45" TV. Nothing special. Folks would probably think the Ti is overkill. But I'll probably be replacing the monitor with a 4k unit soon.

    Either way, I tend to stick with a GPU for 4+ years if I can swing it (last card was a Radeon 7970), so the 1080 Ti grants me a long lifespan before needing to upgrade the GPU. If AMD had an option in between the 1080 and 1080 Ti, for an in between price, I'd have been very interested in that. But no competition at the high end when I was ready to build, so Nvidia it was.
     
    tungt88 likes this.
  27. DejaWiz

    DejaWiz Oracle of Unfortunate Truths

    Messages:
    17,626
    Joined:
    Apr 15, 2005
    Now that the 1080Ti is out and some rough details about Vega have been emerging, I'm changing my previous statement.

    I'm fully expecting (read: assuming) the top end Vega to land somewhere between the 1080 and 1080Ti, all test environments being equal.
     
  28. tungt88

    tungt88 [H]ard|Gawd

    Messages:
    1,720
    Joined:
    Jan 14, 2008
    Yeah, the good performance and massive RAM amount are what steered me towards the GTX 1080 Ti.
    I play a lot of Fallout 4 with the High Resolution Texture Pack plus tons of mods, and the VRAM usage, as measured by MSI Afterburner, easily goes into the 7.8GB+ range, even hitting 8,035MB at one point.
    When the RAM usage hits the 7.6GB+ area, I start to get some stuttering -- that's something that I think will be handled well by the GTX 1080 Ti's 11GB.
    That 11GB should also provide a bit of headroom for the next 1-3 years.
     
  29. Filiprino

    Filiprino Limp Gawd

    Messages:
    226
    Joined:
    Mar 16, 2011
    Still, there's a lack of information on Vega: rehashed HBCC results plus concrete products for HPC and deep learning.
    They will probably sell an 8GB HBM2 consumer card. With the HBCC, it should work as if it had 16GB of GDDR5X under the current way of doing things. As programming evolves, it should become more rewarding.

    Computexxxxxx.
     
  30. Anarchist4000

    Anarchist4000 [H]ard|Gawd

    Messages:
    1,585
    Joined:
    Jun 10, 2001
    At 4k or higher resolutions, where compute performance matters more going by Fury scaling, it's hard to imagine a game taking advantage of FP16 not performing far better on Vega than a 1080 with well over 3x the FP16(FP32 on 1080) performance. Console devs at the very least will support it and it's practically required for any engine also targeting mobile. That's leaving out overclocking, tiling, geometry, DX12/Vulkan/async advantages, and anything else that may have been added. Just looking at that theoretical gap I'd expect it to outperform the 1080ti. Setting the bar low seems to have been the goal.
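    Spelled out, that theoretical gap is just shader count x 2 ops x clock. A quick sketch, with the caveat that the shader counts and clocks below are rumor-based assumptions, not confirmed specs:

    ```python
    # Napkin math for the theoretical FP16 gap. Peak FLOPS = shaders x 2
    # ops/clock (FMA) x clock. All figures below are rumored/approximate.

    def tflops(shaders, clock_ghz, ops_per_clock=2):
        """Peak TFLOPS assuming one FMA (2 ops) per shader per clock."""
        return shaders * ops_per_clock * clock_ghz / 1000.0

    # GTX 1080: 2560 CUDA cores at ~1.73 GHz boost. GP104 runs native FP16
    # at 1/64 rate, so in practice FP16 code just uses the FP32 path.
    gtx1080_fp32 = tflops(2560, 1.73)

    # Rumored big Vega: 4096 shaders at ~1.6 GHz, with packed math
    # doubling FP16 throughput over FP32.
    vega_fp32 = tflops(4096, 1.6)
    vega_fp16 = vega_fp32 * 2

    print(f"GTX 1080 FP32: {gtx1080_fp32:.1f} TFLOPS")   # ~8.9
    print(f"Vega FP32:     {vega_fp32:.1f} TFLOPS")      # ~13.1
    print(f"Vega FP16:     {vega_fp16:.1f} TFLOPS")      # ~26.2
    print(f"FP16 vs 1080's FP32 path: {vega_fp16 / gtx1080_fp32:.1f}x")
    ```

    With those assumed clocks the gap works out to roughly 3x; bump either clock assumption and the ratio moves accordingly.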

    Obviously the cores did get reworked going off the higher clocks that have been demonstrated. While just speculating, I'm eyeing the Ryzen FPU design exploded out to 16 lane SIMD and a flexible scalar or two on the side to pick up integer, scalar, and other costly work. Defining a clock as a MUL operation instead of FMA would probably net the clockspeed increase we've seen. If the budget is tight reusing parts from Zen also makes sense and SMT is a viable GPU technique we haven't seen used. Everyone keeps assuming the traditional 2 ops/clock is MUL+ADD, but MUL+MUL is a possibility along with accumulators. That works for DL and graphics and is far more compute capability than the raw TFLOPs figure would suggest.

    It's also possible they designed it not to pull any power through the PCIe slot, given all the I/O on Naples and the Polaris power-draw issues. At x16, a single system could have 8 GPUs attached, and I wouldn't rule out 16 discrete GPUs (Falconwitch) or even dual cards for density.
     
    DigitalGriffin likes this.
  31. odditory

    odditory [H]ardness Supreme

    Messages:
    4,166
    Joined:
    Dec 23, 2007
    "When programming evolves". Heh. More pie in the sky wishfulness, in a world where engine developers will continue to target lowest common denominator. The last eye roller VRAM speculation was the "DX12 will stack VRAM with multiple GPU's!" line that people were so fond of here. And of course it turned out to be total nonsense.
     
    GoldenTiger likes this.
  32. Gigantopithecus

    Gigantopithecus [H]ard|Gawd

    Messages:
    1,080
    Joined:
    Aug 6, 2009
    Thanks again, Kyle. Assuming your sources are correct that the most powerful Vega will perform similarly to the GTX 1080, and given that many stock or slightly overclocked GTX 1080s have a single 8-pin while the more aggressively overclocked ones usually have an 8-pin plus a 6-pin, a 1080-equivalent Vega with two 8-pin connectors implies Vega is not as power-efficient as Pascal.

    That's disappointing, especially considering Vega is a year behind Pascal. But it is heartening in that at least AMD has reasonable competitors for the 1070 and 1080 now.
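    For reference, here's what those connector configurations imply for spec power ceilings, using the PCIe spec figures of 75 W from the slot, 75 W per 6-pin, and 150 W per 8-pin:

    ```python
    # Spec board-power ceilings implied by PCIe power connector configs.
    PCIE_WATTS = {"slot": 75, "6pin": 75, "8pin": 150}

    def max_board_power(*connectors):
        """Spec ceiling for a card drawing slot power plus the listed plugs."""
        return PCIE_WATTS["slot"] + sum(PCIE_WATTS[c] for c in connectors)

    print(max_board_power("8pin"))          # stock GTX 1080: 225 W ceiling
    print(max_board_power("8pin", "6pin"))  # overclocked 1080s: 300 W ceiling
    print(max_board_power("8pin", "8pin"))  # the Vega card shown: 375 W ceiling
    ```

    A ceiling isn't a measured draw, of course; it just bounds what the board can legally pull.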
     
  33. shad0w4life

    shad0w4life Limp Gawd

    Messages:
    503
    Joined:
    Jun 30, 2008
    I tried gaming on a 1080p TV with a 1070 in Black Desert, Witcher 3, and Dark Souls 3, and felt sick after a few minutes: so much screen tear, and it almost seems amplified. It's really noticeable in DS3, where you're looking around fast a lot.

    G-Sync/Freesync is a must for playing games, and with G-Sync you pay a hefty fee for some pretty crappy monitors. Right off the bat, Nvidia costs a lot more for a proper game-playing setup, on top of their already higher prices.
     
  34. Syntax_Error

    Syntax_Error [H]Lite

    Messages:
    76
    Joined:
    Nov 30, 2016
    It's kinda funny how no one talked about GFE telemetry and spyware when AMD was letting a third party collect their user data through the Gaming Evolved app.

    Customers: Why don't we have something like the GFE?

    AMD: Err... hold on... ahh wait, wait... there you go, the awesome GAMING EVOLVED. Just as good, no wait, actually much better than GFE.

    Customers: Yaaay. But wait, it is being force installed. We want a choice.

    AMD: Really? C'mon. Okay whatever, here... take your choice, and remember that we love you, teehee.

    Customers: It's not really doing anything useful, and waitaminute, is it mining data from us for RAPTR?

    AMD: WHAT? NO WAY! Well kinda. Maybe just a tiny bit.

    Customers: We don't want it.

    AMD: (grumble grumble) Eh? Okay whatever.

    Customers: Why don't we have something like the GFE?

    AMD: Grrr, it has telemetry y'know. Which is like this evil unholy thing that will rob you of your soul and eat your babies.

    Customers: O... M... G!!!


    Here is an alternative: take OFF the tinfoil hat.
     
    Factum and GoldenTiger like this.
  35. yyv_146

    yyv_146 [H]Lite

    Messages:
    108
    Joined:
    Jun 16, 2016
    Yep. I ran some napkin math on current GPU price-to-performance ratios and just bought a pair of 1080s for some CUDA work: $850 for an amazing 18 TFLOPS. The hypothetical best-value scenario involves three 1070s, but that's a much dicier proposition for many reasons.

    Assuming you can efficiently use multi-GPU setups, the 1080 is simply a great deal. Otherwise, Vega will need to either be <$400 or capable of going against the 1080 Ti. That's possible, though, and in particular it might overclock well even in a reference design.
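    That napkin math looks roughly like this. The street prices and TFLOPS figures are my assumptions (a 1080 pair at $425 each matching the $850 quoted, a 1070 at $300 assuming a deep rebate deal), not authoritative specs:

    ```python
    # Peak-FP32-TFLOPS per dollar for a few multi-GPU configurations.
    # Prices and TFLOPS are rough, deal-dependent assumptions.
    cards = {
        # name: (street_price_usd_each, approx_fp32_tflops_each at boost)
        "GTX 1080": (425, 9.0),
        "GTX 1070": (300, 6.5),
    }

    def setup_value(name, count):
        """Total price, total TFLOPS, and TFLOPS per $1000 for `count` cards."""
        price, tf = cards[name]
        total_price, total_tf = price * count, tf * count
        return total_price, total_tf, total_tf / total_price * 1000

    for name, count in [("GTX 1080", 2), ("GTX 1070", 3)]:
        price, tf, value = setup_value(name, count)
        print(f"{count}x {name}: ${price}, {tf:.1f} TFLOPS, {value:.1f} TFLOPS/$1000")
    ```

    With those assumed prices the 1070 trio edges out the 1080 pair on paper, which is exactly why it only wins "hypothetically": the multi-GPU scaling and deal-hunting risks aren't in the math.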
     
  36. GoldenTiger

    GoldenTiger 3.5GB GTX 970 Slayer

    Messages:
    17,304
    Joined:
    Dec 2, 2004
    Already seeing new GTX 1080 cards in the $420 to $440 range on deals. And that's without Nvidia dropping prices on anything yet. Also, Volta is almost here...
     
  37. Bandalo

    Bandalo 2[H]4U

    Messages:
    3,692
    Joined:
    Dec 15, 2010
    Perhaps I'm just not reading your post properly. I understand your complaint about telemetry. What I don't understand is your stance on this issue. My original point was they (Nvidia/Intel) weren't "overcharging", they were charging what the market would bear. I don't think they should be demonized for that.
     
    GoldenTiger likes this.
  38. fs123

    fs123 Kyle is a Bad Motherfucker

    Messages:
    92
    Joined:
    Jun 26, 2010
    That render is of the professional Frontier Edition. The gaming card will probably have an 8-pin + 6-pin or less.
    The early Doom demo system was running a Vega card with just one PCIe power connector.

    Check the PSU at 4:55 in this video. It only has two cables connected, and one of them is the hard drive's SATA connector.

     
    Last edited: May 17, 2017
    Gigantopithecus likes this.
  39. zerogg

    zerogg Limp Gawd

    Messages:
    148
    Joined:
    Jan 26, 2017
    I bet the miners will reserve a lot of that initial batch. They might really shine in that area.
     
  40. SighTurtle

    SighTurtle Gawd

    Messages:
    941
    Joined:
    Jul 29, 2016
    It helped that AMD's Gaming Evolved was utter trash and therefore no one bothered to care.
     
    Cha0s likes this.