From ATI to AMD back to ATI? A Journey in Futility @ [H]

Discussion in 'AMD Flavor' started by Kyle_Bennett, May 27, 2016.

  1. spine

    spine 2[H]4U

    Messages:
    2,220
    Joined:
    Feb 4, 2003
    Indeed, still waiting, but I have to say, something seriously feels off with AMD's graphics division right now...

    Make no mistake, the future of RTG is in serious doubt. RX Vega = 1080. We all know it, so the only choices AMD now has are these:

    1. Don't release RX Vega.
    2. Release it and seriously undercut Nvidia's equivalent. Basically Polaris all over again.

    Frankly, if I were AMD, option 1 would be the wisest choice, followed by spinning off RTG. With the complete lack of current AMD stock on shelves, no confirmation at all of any RX Vega products, and no indication that they'll even sell significantly, it feels to me that what Kyle originally said is pretty much going to play out.
     
  2. WorldExclusive

    WorldExclusive [H]ardForum Junkie

    Messages:
    10,748
    Joined:
    Apr 26, 2009
    I'm waiting on the follow-up article to this.
    All signs are pointing to futility.
     
  3. SighTurtle

    SighTurtle [H]ard|Gawd

    Messages:
    1,132
    Joined:
    Jul 29, 2016
    FFS, unless the "Raja's rebellion" story is true and they still want to spin off, why would AMD, who spent the last 2+ years basically saying, "We are competitive, we have CPUs and GPUs, please give us money," spin off RTG now? Does anyone understand how bad it would look? Confidence in AMD would collapse, AMD would lose its theoretical potential in the HPC and GPU markets, and it would still need GPU tech for its CPUs! Why give that tech away? No, unless RTG really thinks it can find a new home, AMD is going to take the damage and move on, promising Navi and hopefully relying on CPU sales to get the money needed to invest in RTG again.
     
    razor1 likes this.
  4. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,711
    Joined:
    Jul 14, 2005
    Added to that, RTG's IP is spread across all of AMD's brands, so what will AMD do if they spin off RTG? Lose consoles, lose APUs, lose their semi-custom business? They would lose a lot if they spin off RTG. They just have to swallow the pill and hope their CPUs can cover RTG.
     
    Magic Hate Ball likes this.
  5. Maddness

    Maddness Gawd

    Messages:
    706
    Joined:
    Oct 24, 2014
    Well, their CPU division looks to be heading in the right direction. I just wish their GPUs were doing the same.
     
  6. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,711
    Joined:
    Jul 14, 2005
    Well, it's an extended problem: with RTG faltering, they won't be able to expand it quickly or increase its R&D. Pretty much the CPU division now has to cover all of RTG's R&D, expenses, and future chips on top of the debt, while increasing its own R&D for future CPUs. It's still an uphill battle; not as bleak as before, but still.
     
  7. Drewis

    Drewis Limp Gawd

    Messages:
    428
    Joined:
    Jul 25, 2006
  8. scojer

    scojer 2[H]4U

    Messages:
    2,842
    Joined:
    Jun 13, 2009
    These things, they take time.
     
    razor1 likes this.
  9. SickBeast

    SickBeast [H]Lite

    Messages:
    69
    Joined:
    Jan 29, 2012
    My real concern for RTG is that Volta will be the death blow.
     
  10. sine wave

    sine wave n00bie

    Messages:
    54
    Joined:
    Oct 10, 2015

    Glad you have those concerns, but I don't see AMD losing too much sleep over the "Gaming" version of Volta. That is roughly 10-11 months away. Vega is going to have free rein in the gaming market through the 2017 holidays. Cheap 4K FreeSync2 monitors for everyone, even the basement dwellers. Oprah will be giving them away on TV, an FS2 craze for young gamerz. Ergo: AMD will have mindshare when Volta hits.

    But honestly, if AMD is sandbagging the "control fabric" side of things, then Volta might not even stand a chance against a Vega x2, which I think will be released some time in October. That's still in time for the holidays, and it would place AMD on top of the GPU wars, about 40% out in front of the Titan Xp (Pascal). Not forgetting that the Navi uarch is soon coming down the pipe, and the transition into the RX Vega SKU is going to mean a possible Vega x4 (RX4). I really don't think Volta @ 800mm^2 is going to be able to compete with AMD's direction and strategy. As facetious as that may sound, it is entirely plausible knowing ALL WE KNOW so far about AMD & their technology. I am just reacting to the cadence of AMD's tick-tock cycles between their various platforms.

    RX Vega will be a hit for gamers & a blow for Nvidia.

    ~ sine wave ~
     
  11. SickBeast

    SickBeast [H]Lite

    Messages:
    69
    Joined:
    Jan 29, 2012
    So you think AMD's answer to Volta should be a dual Vega card? LOL.
     
  12. madgun

    madgun [H]ard|Gawd

    Messages:
    1,734
    Joined:
    May 11, 2006
    A dual GPU is not equal to a single GPU in a direct comparison.

    Single-GPU performance and consistency are much preferred, hence most people pick the fastest single card over dual GPUs all day long.

    If one prefers playing benchmarks all day, then dual GPUs suffice lol.
     
    Last edited: Jul 27, 2017
    razor1, Chimpee and SickBeast like this.
  13. Digital Viper-X-

    Digital Viper-X- [H]ardForum Junkie

    Messages:
    13,304
    Joined:
    Dec 9, 2000
    I don't care if it's a dual, single, or triple GPU on a single card; if it performs, it performs, and then you can compare it on price, power, and performance.
     
  14. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,711
    Joined:
    Jul 14, 2005

    Well, that is the thing: currently it just won't. It has to use Xfire, and right now, with most engines using deferred rendering, a dual-GPU card ends up performing like a single-GPU solution. The cost of such a solution, added to the pitfalls, further decreases its effectiveness. We saw this with all dual-GPU tech in the past; the only time it was feasible was prior to deferred renderers. For VR it will come in handy, but that market is so small right now that, again, the cost of manufacturing might be detrimental to the results.
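
    As a toy model of why alternate-frame rendering stops helping once each frame depends on the previous frame's buffers (a minimal sketch; the 16 ms frame time and 2 ms buffer-copy cost are assumed numbers, not measurements):

    Code:
    # Toy model: alternate-frame rendering (AFR) throughput with and
    # without inter-frame dependencies (deferred/temporal renderers).

    FRAME_TIME_MS = 16.0  # time to render one frame on one GPU (assumed)
    SYNC_COST_MS = 2.0    # copying last frame's buffers across GPUs (assumed)

    def fps(gpus: int, inter_frame_dependency: bool) -> float:
        if not inter_frame_dependency:
            # Independent frames: GPUs alternate, throughput scales with count.
            return 1000.0 / (FRAME_TIME_MS / gpus)
        # Each frame waits on the previous frame's buffers, so the work
        # serializes and the cross-GPU copy becomes pure overhead.
        extra = SYNC_COST_MS if gpus > 1 else 0.0
        return 1000.0 / (FRAME_TIME_MS + extra)

    for deps in (False, True):
        print(f"dependency={deps}: 1 GPU={fps(1, deps):.1f} fps, "
              f"2 GPUs={fps(2, deps):.1f} fps")
    # With dependent frames the second GPU mostly idles, and the copy
    # overhead makes the dual-GPU card slightly *slower* than one GPU.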
     
  15. CSI_PC

    CSI_PC 2[H]4U

    Messages:
    2,243
    Joined:
    Apr 3, 2016
    And to add, there is still a sizable market for dual-GPU single cards, specifically in the more lucrative HPC segment; Nvidia still sells its venerable dual-GPU K80, which seems to have a reasonable market due to its FP64 and price (AMD also has a product that competes well in this regard on price and performance).
    So, like you say: performance and price within an acceptable power envelope, which is usually 300W.
    Cheers
     
  16. WorldExclusive

    WorldExclusive [H]ardForum Junkie

    Messages:
    10,748
    Joined:
    Apr 26, 2009
    No one believes what you wrote. lol

    AMD having free rein over the 2017 holidays, when Nvidia has been selling cards at ridiculous profits, unchallenged, for over 4 years now.
     
  17. wrangler

    wrangler 2[H]4U

    Messages:
    3,951
    Joined:
    Jan 17, 2005
    AMD cannot make graphics (mining) cards fast enough. Not even Nvidia can claim to sell every single mid-to-upper-tier GPU they make. AMD can.
    I don't think Nvidia has the R&D advantage that people think it has.
     
  18. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,711
    Joined:
    Jul 14, 2005

    All 1070s are completely sold out, and 1060 6GBs are now too. My Newegg business rep told me their supply managers have limited sales of all 1070s, 1060 6GBs, RX 570s, and RX 580s to 1 per customer. This also applies to regular Newegg consumers, all because of the mining craze. Hence why I started looking elsewhere for mining cards.

    nV has a huge R&D advantage, at least double: AMD needs to split its R&D between CPU and GPU, and its total R&D for both divisions is roughly equal to nV's R&D budget for GPUs alone.
     
    Factum and SickBeast like this.
  19. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,711
    Joined:
    Jul 14, 2005

    Although it won't be a death blow per se, it's going to really limit AMD's market share growth in the dGPU markets, all of them, from HPC and DL (where AMD is nonexistent right now) all the way to consumer. AMD needs to be competitive pretty much from the performance market on down if they don't want to see their market share slip any further. Will Vega 11 and 12 be able to compete with Volta's mid range? We already know what will happen in the performance segments; that is a foregone conclusion. A dual-GPU Vega will NOT compete against those cards; it can't even compete against Pascal.

    If AMD can do something similar with Vega 11 and 12 as they did with Polaris 10, they might be able to stem the market share bleeding, but that is a tough call given what we know about Volta and what we see with Vega currently. They have to get at least 60% better in perf/watt, probably closer to 100%: a 60% increase only matches Pascal's performance segment, and Volta's mid range should land around Pascal's performance segment in speed while drawing less power. So yeah, roughly 100% is what Vega 11 and 12 have to reach to match Volta.
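
    To put rough numbers on that (a minimal sketch in Python; the figures are this post's estimates normalized to Vega = 1.0, not official measurements):

    Code:
    # Perf/watt arithmetic, normalized to Vega = 1.0 (assumed baseline).
    vega = 1.00        # Vega efficiency, baseline
    pascal = 1.60      # ~60% ahead of Vega (the estimate above)
    volta_mid = 2.00   # Pascal-class speed at lower power (assumption above)

    for name, target in (("Pascal", pascal), ("Volta's mid range", volta_mid)):
        gain = (target / vega - 1.0) * 100.0
        print(f"Vega 11/12 need ~{gain:.0f}% better perf/watt to match {name}")
    # -> ~60% to match Pascal, ~100% to match Volta's mid range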
     
    SickBeast likes this.
  20. SickBeast

    SickBeast [H]Lite

    Messages:
    69
    Joined:
    Jan 29, 2012
    I think the end result of Volta is that it's going to relegate AMD to the midrange market, the consoles, and also APUs. I don't think AMD will be able to contend in the high end market for some time to come, if ever. There is always hope for Navi, and if this "infinity fabric" truly is scalable, then yes, just maybe they can strap four midrange parts together and pull something off. I will believe that when I see it, though.
     
    razor1 likes this.
  21. SalemDE

    SalemDE n00bie

    Messages:
    1
    Joined:
    Jul 30, 2017
    I still don't quite understand why they haven't just slammed two Polaris GPUs on a single card with internal CrossFireX. Or just shrunk the Fury X, clocked it higher, and called it a day. It would probably be nothing in terms of R&D compared to RX Vega as a whole new GPU design, and they'd essentially get this kind of performance anyway.

    That fault lies only with Raja, since this is his first card out to the public. He fucked up so badly, and he has known it since the first BS "Poor Volta" PR thing.

    Step aside, Raja; you are far from ready to lead a critical brand, though maybe an engineering/advisor/senior role would fit. ATi needs another Jim Keller to save what's left by doing real shit, not just talking and promising.
     
    SickBeast likes this.
  22. pendragon1

    pendragon1 [H]ardness Supreme

    Messages:
    6,280
    Joined:
    Oct 7, 2000
  23. Araxie

    Araxie [H]ardness Supreme

    Messages:
    5,842
    Joined:
    Feb 11, 2013
    That's basically VEGA, in short..

    VEGA is not a new GPU design, not a new architecture.
     
    razor1 and CSI_PC like this.
  24. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,711
    Joined:
    Jul 14, 2005
  25. sciguy9

    sciguy9 n00bie

    Messages:
    7
    Joined:
    May 27, 2016
    <Facepalm> and I never JUST read a headline. I'm going to blame that one on the 35° + weather ;-)
     
    razor1 likes this.
  26. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,711
    Joined:
    Jul 14, 2005
    np man it happens :)
     
  27. juanrga

    juanrga [H]ard|Gawd

    Messages:
    1,168
    Joined:
    Feb 22, 2017
    1) Very interesting article. It is worth mentioning that those Su-Koduri leadership problems aren't something totally new inside AMD. The fusion of AMD and ATI was never 100% complete, and people working at AMD often organized themselves into GPU guys and CPU guys. Some APUs were delayed due to lack of communication between the CPU and GPU groups, which forced one group to re-design when the other group changed something.

    2) I didn't read the thousand comments, but the Intel CPU with Radeon graphics exists. It is an MCM chip.

    3) Anxiously awaiting a CPU version of this article, explaining the Su-Keller fighting and how Keller finally left AMD.
     
    razor1 likes this.
  28. ecmaster76

    ecmaster76 [H]ard|Gawd

    Messages:
    1,032
    Joined:
    Feb 6, 2007
    Kaby Lake R looks interesting. It slipped out pretty quietly but seems to have some interesting video improvements:

    -Editing video footage is now up to 14.7x faster, so rendering that used to take 45 minutes on a 5-year-old PC now takes three minutes.

    Oops. Nvm, that's compared to a 5-year-old PC, not to previous Kaby parts.
     
  29. jologskyblues

    jologskyblues [H]Lite

    Messages:
    82
    Joined:
    Mar 20, 2017
    The new mobile KB-L-R parts with Intel UHD Graphics have been announced. Still no Radeon graphics, and no VESA Adaptive Sync support either, so it seems. I wonder when Intel will come out with their own variable refresh implementation of the VESA Adaptive Sync standard, since they mentioned years ago that they were actively looking into the technology. I would imagine that if they choose not to integrate AMD GPUs as their processor graphics solution, then apart from the hardware/software/driver work Intel would have to invest to implement variable refresh on their own iGPUs, they would still have to go through the variable refresh certification process in collaboration with AMD to ensure proper operation with FreeSync-certified Adaptive Sync monitors. The question is, are they willing to do that?
     
    Last edited: Aug 21, 2017
  30. bigdogchris

    bigdogchris Wii was a Novelty

    Messages:
    16,938
    Joined:
    Feb 19, 2008
  31. SighTurtle

    SighTurtle [H]ard|Gawd

    Messages:
    1,132
    Joined:
    Jul 29, 2016
    Why would AMD kill off Raven Ridge? This is a nonsense rumor. AMD will not help Intel kill their own APUs. Also Intel would never hint at AMD tech in their marketing.
     
  32. Darakian

    Darakian [H]ardness Supreme

    Messages:
    4,868
    Joined:
    Apr 12, 2004
    AMD wouldn't kill Raven Ridge, but they would kill a good number of laptops with GeForces. Intel still commands the high-end market, and a quad-core Intel chip with mid-range Vega sounds like a discrete-GPU killer to me.
     
  33. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,711
    Joined:
    Jul 14, 2005

    Won't happen; they don't have the bandwidth to match up against a 1050 discrete GPU, let alone anything higher.

    It doesn't give Intel anything more than it already has. It would kill off Raven Ridge for the most part; those are the only laptops they can go after.

    IGPs will never take on the discrete market. It was talked about how many years ago, and the dynamics of the market only changed at the ultra low end. That is because IGPs just don't have the bandwidth necessary to take on anything discrete above the ultra low end (see the rough numbers below).

    Let's talk about the die sizes needed to get anything more than a couple hundred CUs into an APU; that isn't going to happen either, because a person looking for graphics performance will not want an APU/IGP. It just doesn't make sense, and it increases the cost of the silicon too. Why would anyone pair a dual-core CPU with a monster APU or IGP? Doesn't make sense, right? You need a CPU able to feed them.
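
    For the bandwidth gap, a back-of-envelope sketch (using public specs for a typical 2017 dual-channel laptop and the GTX 1050; the helper functions are illustrative, not from any post here):

    Code:
    # Peak memory bandwidth: system DDR4 (shared with the CPU) vs. a
    # low-end discrete card's dedicated GDDR5.

    def ddr_gbs(mt_per_s: float, channels: int, bus_bits: int = 64) -> float:
        """Peak GB/s for DDR system memory (bus_bits per channel)."""
        return mt_per_s * 1e6 * channels * (bus_bits / 8) / 1e9

    def gddr_gbs(gbps_per_pin: float, bus_bits: int) -> float:
        """Peak GB/s for graphics memory."""
        return gbps_per_pin * bus_bits / 8

    print(f"DDR4-2400, dual channel: {ddr_gbs(2400, 2):.1f} GB/s")   # ~38 GB/s
    print(f"GTX 1050, 7 Gbps GDDR5 @ 128-bit: {gddr_gbs(7, 128):.1f} GB/s")  # 112 GB/s
    # An IGP shares that ~38 GB/s with the CPU; the 1050 gets 112 GB/s to itself.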
     
    Last edited: Oct 9, 2017
    Factum and juanrga like this.
  34. Montu

    Montu [H]ard DCOTM x4

    Messages:
    6,593
    Joined:
    Apr 25, 2001
    Factum, Algrim, razor1 and 1 other person like this.
  35. Darakian

    Darakian [H]ardness Supreme

    Messages:
    4,868
    Joined:
    Apr 12, 2004
    Intel does on-package DRAM, so... memory access might be there. On CUs you've got a point, but even a 7850K with a better memory system is at least interesting, and with updated cores it might even be compelling.
     
  36. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,711
    Joined:
    Jul 14, 2005
    On-package DRAM will still need to be capable of supplying the same bandwidth/performance ratios as the desktop counterparts if they want the same performance; otherwise they will have limitations.
     
  37. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    5,229
    Joined:
    Feb 22, 2012
    Companies do deals like this all the time, so it's not that it can't happen.

    For Intel, they could increase their performance and secure space currently held by discrete GPUs, and AMD gets risk-free profit from their anemic graphics sector.

    The question is whether it really competes against AMD's APUs. If they restrict Intel to the high end, then I'd say no. Kinda feels like that portion of the market is too small for this to happen, though.

    I'd say it'd be a win for AMD though. Everyone loves risk-free profit.
     
  38. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,711
    Joined:
    Jul 14, 2005
    How is it going to go into mid-range or high-end laptops? That means it would need the performance of the 1060, 1070, and 1080, and those laptop chips are damn close to the performance of the desktop components. That is just crazy after what we saw of Vega's power consumption at those performance levels.

    It also makes no sense, cause they already have that laptop business on the CPU side of things. What is the incentive to just give money to AMD when they don't have to?

    The only company that would lose out would be nV, if it were even possible from a thermal/power consumption/performance point of view. Intel doesn't make money on the extra IGP portion, and AIBs make their extra money off of the discrete graphics cards, so it hurts them too.
     
    defaultluser likes this.
  39. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    5,229
    Joined:
    Feb 22, 2012
    When I said high end, I meant relatively, for iGPUs/APUs.

    I agree with you though. I don't see it happening. Collaboration between competitors does happen all the time... when it makes sense ($$$). Usually if it opens up an untapped market.
     
    Last edited: Oct 9, 2017
    razor1 likes this.