AMD Releases more Carrizo Details: AMD's ISSCC 2015 Presentation

Discussion in 'AMD Processors' started by cageymaru, Feb 23, 2015.

  1. daglesj

    daglesj [H]ardness Supreme

    Messages:
    5,096
    Joined:
    May 7, 2005
    Erm, the consoles would still sell if they had dog shit in them, as long as they did the same job. The consoles are NOT AMD's product, they are Sony's and Microsoft's, two brands most of the world has heard of.

    Consumers are not buying the consoles for the APU in them. They buy them to play games. AMD got the gig because the console manufacturers knew they could get a good deal and that Nvidia would no longer do business with them.

    AMD was the last cheap date left in the club! Look at what's written on the APUs... AMD doesn't get a look in. They don't even get to put their branding on them.

    If you asked 100,000 console owners why they bought their One/PS4, I bet the following phrase -

    "Oh man I had to buy them due to the super cool AMD APU in it...I'm just crazy for their lithography!"

    Would never be said.

    Plus, at the end of the day, both the One and the PS4 are very disappointing tech-wise. Not as cutting-edge as the previous gen was at release.
     
    Last edited: Mar 4, 2015
  2. defaultluser

    defaultluser [H]ardForum Junkie

    Messages:
    12,531
    Joined:
    Jan 14, 2006
    Right, I only pointed out Intel as an in-depth part of the discussion because they're in the exact same industry, and they're making money while AMD isn't. I believe I also mentioned a bevy of other electronics/software companies that know how to build and market products and convince end-users that they need them.

    Apple is doing incredibly well for not having the easy out of selling Windows. Since they gave up on PowerPC, they've doubled their market share - which is impressive considering they have not had to sacrifice their incredible margins.

    http://www.theregister.co.uk/2014/0...d_laptop_market_share_dips_below_90_per_cent/

    They keep the ads flowing, and keep offering a quality product ecosystem that nobody else can match, and MORE PEOPLE KEEP PAYING FOR NEW MACHINES. That sounds like the exact opposite problem AMD has, yet both are CURSED with this terrible x86 architecture AMIRITE?

    The ceiling on total Apple OS X sales is always going to be limited by the price they sell their hardware at. But they don't care, because they would rather have those incredible margins than 90% market share with the majority being low-margin sales. This is the same issue that plagued Dell's and HP's PC divisions, prompting one to go private to try to move up-market like Apple (Dell), while the other tried to sell off its PC division some years back (HP).

    Consoles don't make AMD money, because they are a component supplier, not the dealer of the final product. The dealer of the final product makes money on the sale of the hardware (later in life, when costs fall), and on the sale of software + accessories + services, which is really lucrative. In some cases AMD may only receive a one-time payment for use of its IP, or if they negotiate a per-sale fee it will be something small.

    A pure component supplier doesn't make big profits. Want to know how Samsung suddenly became top-of-the-game in profit growth the last few years? They started selling and heavily promoting high-end smartphones with ridiculous profit margins, which is the same method Apple used to grow monstrous. This is despite the fact that they supplied every single ARM chip Apple sold in every iPhone through the 5s... all those sales were low-margin, so they decided to get in on the high-margin cell phone game too.

    YOU DON'T MAKE SERIOUS PROFIT SELLING COMPONENTS THESE DAYS, unless you have a technological advantage on a feature everyone wants, like Qualcomm and LTE.
     
    Last edited: Mar 4, 2015
  3. daglesj

    daglesj [H]ardness Supreme

    Messages:
    5,096
    Joined:
    May 7, 2005
    Indeed it's like saying -

    The only reason people buy the Apple iPhone is due to the wonderful ARM chips inside them.

    Oh totally. No other reason. :rolleyes:
     
  4. defaultluser

    defaultluser [H]ardForum Junkie

    Messages:
    12,531
    Joined:
    Jan 14, 2006
    Speaking of companies that know how to make money, Nvidia today is pushing the Shield Console:

    http://anandtech.com/show/9047/nvid...nsole-tegra-x1-android-tv-box-shipping-in-may

    HDMI 2.0 support + 4K HEVC decode = the first external box to potentially stream 4K Netflix content (not yet confirmed, but the obvious target). This is impossible on HTPCs, and until this product it was only possible using the crappy built-in apps on 4K TVs. Nvidia sees a chance to be the Roku of the 4K generation (with a much better gaming experience), and they are jumping on it. Voice search is also an obvious ploy for the media-streaming market here.

    Native Google Play plus the long list of Android game ports they've already made over the years (plus a few new games for this launch) makes it better than any other micro-console out there. And thanks to the Maxwell GPU, they have the potential for 1080p for newer games (probably lower resolution, but likely competitive with the Xbox One).

    There's also the promise of full-quality PC game streaming, assuming they put server boxes in your neighborhood so the lag is low enough.

    A $200 console with tantalizing features that nobody else has = very tempting. There are a lot of people who have not been willing to trade up from their Xbox 360 due to the high price of the One, so if they can make the pitch attractive enough, this could be the Wii of this generation: cheap, does everything you want out of a media box, and has the potential for impressive streamed games if they can build enough distributed server infrastructure to give everyone low lag. It's no surprise that their pitch is squarely aimed at the Xbox 360, which this massively outperforms.

    See, THAT is how you build a product and sell it. Nobody buying your incredibly powerful SoC? Turn it into a console, and use your industry connections to get game ports. Hardware + software + accessories = serious potential money, if people bite. AMD would never think of doing such a thing, even though they went and made two capable SoCs for two console makers. Really, Kabini was powerful enough for this when it was released ~2 years ago. They don't have the vision, the balls, the cash, or the industry connections, and that is why they are failing.

    What does AMD do with their inexpensive gaming-tastic platform? They make a socketed version to follow the race to the bottom, because they can't think outside their little PC box.
     
    Last edited: Mar 4, 2015
  5. Rizen

    Rizen [H]ardForum Junkie

    Messages:
    9,220
    Joined:
    Jul 16, 2000
    Sorry, but I think the fact that the Ouya failed miserably at a much cheaper price point doesn't bode well for NVIDIA here. 4K streaming content is nice, but how many people have connections that will handle 4K streaming, or 4K TVs? It's a selling point, but that alone won't move units. Streaming is nice, but Valve has a PC streaming device that will let gamers stream any game from their own PC for $50.

    This is just NVIDIA's attempt to recoup costs on developing the Tegra X1, since they continue to fail at securing any design wins in the mobile space with their processors.
     
  6. pcjunkie

    pcjunkie 2[H]4U

    Messages:
    2,602
    Joined:
    Jun 11, 2012
    The home entertainment/console space is already saturated. I don't know why Nvidia thought it wise to jump into the game this late unless it has no other options. Seems like a money pit to me.
     
  7. Martelol

    Martelol n00b

    Messages:
    14
    Joined:
    Nov 17, 2014
    Agreed. The Shield is ultimately not going to be any more useful for the vast majority of the market than any other Android TV box. From a tech nerd perspective, I look at it and think "That's kinda neat" and not much else. It's almost certainly the most powerful Android TV box, but Android TV boxes only need to be powerful enough to play video and navigate the UI smoothly.

    A handful of AAA titles that have been out for multiple years being ported to Android and running at 30fps does not make me want a Shield. GameStream does not make me want a Shield when Valve's streaming hardware will do the same thing for 1/4 of the price with any vendor's GPU later this year. The Shield is hilariously out of touch with the market it's entering.

    The X1 might be impressive, but the product they're launching it with is pointless.
     
  8. daglesj

    daglesj [H]ardness Supreme

    Messages:
    5,096
    Joined:
    May 7, 2005
    Well at least they are trying and also raising their market profile.

    Kind of more than AMD is doing.
     
  9. defaultluser

    defaultluser [H]ardForum Junkie

    Messages:
    12,531
    Joined:
    Jan 14, 2006
    I'm not saying they're going to succeed, I'm simply pointing out that they are TRYING.

    It's more effort than AMD has put into a BRAND NEW shipping product for END USERS in the last decade, and Nvidia does this at least once a year. Just because previous offerings don't succeed doesn't mean you stop trying to pitch new products. If all you do is sell the same thing you sold yesterday, you're doomed to fail.
     
  10. Rizen

    Rizen [H]ardForum Junkie

    Messages:
    9,220
    Joined:
    Jul 16, 2000
    AMD is doing a lot of work with VR and low-level APIs, which are - imo - of greater use to the market than a streaming Android TV box.
     
  11. Rizen

    Rizen [H]ardForum Junkie

    Messages:
    9,220
    Joined:
    Jul 16, 2000
    I'm not disagreeing with you on the value of introducing new, innovative products into a market that is relatively stagnant. However, introducing products that fail doesn't help your bottom line, and I really can't see this product being a resounding success.

    I also think you are overlooking a lot of the things AMD has done for the PC industry: APUs in general, Eyefinity, HSA computing, VR, low-level APIs, as well as their consistently solid GPU technology. Branching off into set-top boxes and game consoles given their cash problems isn't really a good strategy; that market already has fierce competition.
     
  12. durquavian

    durquavian Gawd

    Messages:
    757
    Joined:
    Dec 13, 2014
    It's probably better for AMD to concentrate on Zen than to push out into other avenues. I'd rather they put effort into Zen than waste it on anything else.
     
  13. defaultluser

    defaultluser [H]ardForum Junkie

    Messages:
    12,531
    Joined:
    Jan 14, 2006
    Let me get something straight with you: the Android handset SoC market is saturated. Since 2013 you have been able to buy a MediaTek part with 8 A7 cores, competent Mali 450 graphics and 3G for peanuts, and in 2014 they announced 8 A53 cores, 2K display support, 4K H.265 and LTE at the same price point. Both Intel and Samsung are trying their hands at integrated 4G as well on their latest offerings, so even that is no longer safe from poachers. Qualcomm is so up against the wall that they're now selling the Snapdragon 410, with 4 A53 cores and LTE, at that same peanuts price as MediaTek et al.

    Did you hear that? LTE is NO LONGER A PREMIUM FEATURE, and ARM's stock cores are tweaked to the point where they're good enough that even Qualcomm is using them in most of their SoCs. This means THE UNSINKABLE Qualcomm's profits are going to tank over the next few years, because even when they release the fancy new Kryo, demand will not be as strong as it was for the impressive Krait, because people are satisfied with other makers.

    This is the modern Android world Nvidia is faced with.

    Now, Nvidia got forced out of the x86 world (no chipset license, no x86 license), and Windows RT was stillborn, so trying to grow Tegra on Windows is out. That leaves them just one tangible option: carve out a space for themselves in the busy Android world. They either innovate or they exit the market.


    So, the only way to grow in the Android world is to carve out a potential growth space, and since they have GPU prowess, and because the latest console generation is not exactly the overpowering behemoth the last one was, Nvidia sees an opportunity in consoles. The X1 is several times more powerful than the 360 it's price-competitive with, and likely almost as powerful as the One. It's a surprising gap left by the current console leaders, who wanted to play it safe this generation.

    Yeah, this has "already been done" by lots of players, but nobody up until now has really put this much power and this much industry support behind a microconsole. It may be enough to succeed where others have failed.

    Like what?

    An obvious end for chipset-integrated graphics, which were introduced by players like SiS in the 90s to reduce costs. In fact, both AMD and Intel were WAAAYY behind the times here, as Cyrix had already released the MediaGX with integrated VGA graphics on-chip in 1997!

    Just like integrating the memory controller, this was just a question of "when" rather than something that required some sort of amazing technological breakthrough.

    Intel released Sandy Bridge on the same day as AMD released Brazos (AMD was not exactly leading the way with on-chip graphics here), and some would say that Intel did a better job with on-die graphics in their first attempt, as they allowed the CPU and GPU to both access the L3 cache.

    AFAIK even Kaveri hasn't fixed the memory bus disparity yet, and it's also not HSA 1.0 compliant (you have to wait for Carrizo). That's a helluva wait for a "killer" feature AMD's been harping on about.

    /me still waiting for that killer app that's massively accelerated by HSA

    Botched the launch because they wanted to surprise Nvidia. Yeah, they caught Nvidia off-guard for two years, but they also caught themselves off-guard. Some major failings that made Eyefinity weak:

    1. No support for passive DisplayPort-to-DVI/HDMI/VGA adapters beyond 2 displays, because they didn't want to take the time to alter the GPU mask to add a third DVI clock generator. This made for tons of fun, with lots of threads asking "how do I get three displays working?" It was a real problem because there were almost no DisplayPort displays available at launch.

    It took them a year to release the affordable active adapters they should have had at launch, and by that time Nvidia had countered with Surround via SLI.

    It took them until the 290 and 260 series (4 years) to add more than two DVI clocks to the die. Nvidia had four DVI clocks on the die when they officially added DisplayPort with the 680/640 launch, 2 years after Eyefinity.

    2. DisplayPort and DVI outputs used different clocks, resulting in tearing between displays. I think this required a hardware rev to fix, so who knows if it was ever fixed. Not something that was well thought out before they gifted this problem child to the world, eh?

    http://support.amd.com/en-us/kb-articles/Pages/Screentearingwithmultipledisplays.aspx


    That have never shipped in non-beta form. I'm talking about stuff people actually see, because if people don't see your name on something, YOU DON'T EXIST.
     
    Last edited: Mar 4, 2015
  14. daglesj

    daglesj [H]ardness Supreme

    Messages:
    5,096
    Joined:
    May 7, 2005
    Whoopee! VR (again... like crappy 3D cinema, it never goes away) and low-level APIs (yawnsville for 99.999% of the world's population).

    Ain't gonna make them money or raise their profile one iota.

    Amazed that so many AMD fans just want them to stay hidden in a cupboard.*

    "AMD is my wittle sekwet!":rolleyes:





    *I'm an AMD fan, but I'm one that would like AMD to prosper and become a major force in modern computing rather than an obscure, near-forgotten footnote in computing history.
     
    Last edited: Mar 4, 2015
  15. durquavian

    durquavian Gawd

    Messages:
    757
    Joined:
    Dec 13, 2014
    You are quite the pessimist. So much negativity. Your opinion does not necessarily reflect every other user's. AMD has done quite well in the last year or two reining in their marketing and keeping quiet until release. I like their focus. And low-level APIs aren't really yawnsville, seeing as DX is becoming close to one with 12. It all starts somewhere. No matter how small or wasteful it may seem, you have to remember a lot of great ideas and innovation came from mistakes, or when another outcome was the goal.
     
  16. fixedmy7970

    fixedmy7970 Gawd

    Messages:
    524
    Joined:
    Jul 20, 2013
    Y'all Intel fanbois are getting off-topic. Carrizo shows that AMD can take a power-hungry architecture and optimize it very well. If AMD was releasing CPUs at 14nm like Intel, things would be very different...

    2016 my friends, AMD is going to put Intel in a world of hurt. 14nm Zen.
     
  17. daglesj

    daglesj [H]ardness Supreme

    Messages:
    5,096
    Joined:
    May 7, 2005
    Well I hope that's the case...but I'm not going to hold my breath.


    When it comes to being an AMD fan, it's better to prepare for disappointment. It hurts less when it happens.
     
  18. Lunas

    Lunas [H]ardForum Junkie

    Messages:
    9,851
    Joined:
    Jul 22, 2001
    Yeah, I just don't want it to be Bulldozer again.
     
  19. Trimlock

    Trimlock [H]ardForum Junkie

    Messages:
    15,157
    Joined:
    Sep 23, 2005
    Exactly, and I think anyone rooting for AMD with Zen will be cautiously optimistic about it as well.

    I'm aiming for competitive.
     
  20. Lunas

    Lunas [H]ardForum Junkie

    Messages:
    9,851
    Joined:
    Jul 22, 2001
    I think it will be 1-2 steps behind and maintain the place they are at right now, but I don't think it will surge ahead or be competitive in areas other than price. I hope I'm wrong, and that between HSA and the voltage-adaptive clock speed thing they are introducing with Carrizo, they step up to Intel and deck them in the face, and then Zen is a step beyond Intel...
     
  21. griff30

    griff30 I Lower the Boom!

    Messages:
    5,399
    Joined:
    Jul 15, 2000

    I'm a fan of AMD but I'm not delusional.
     
  22. Pieter3dnow

    Pieter3dnow [H]ardness Supreme

    Messages:
    6,789
    Joined:
    Jul 29, 2009
    So what exactly did Sony and MS develop hardware-wise, then? AMD made sure they had a SoC which did most of the stuff they needed it to do; after that, they could not screw it up.

    And your analogy of the tech not being as exciting as the last gen is really stupid, since that gen was outsold by the Wii, and for the most part the Xbox 360 and PS3 claimed 1080p but failed to come even close, regardless of the technical issues some had/have.

    I'm not too sure that you understand what this is all about. Check how much Intel loses on tablets and phones, then come back to me with statements about profit on consoles.

    And the people that know about AMD do know what they are buying; you are suggesting that people can change their opinion based on things like colour or shape which they find attractive, and base their spending on that.
     
  23. Pieter3dnow

    Pieter3dnow [H]ardness Supreme

    Messages:
    6,789
    Joined:
    Jul 29, 2009
    You pretend that AMD can spend money on branding and then everything will turn around, but the fact that Apple desktop PCs are still not selling like hotcakes proves my point. Doubling your market share still does not prove that branding is everything.

    You have to have a market to sell things in. The PS4 and Xbox One are real; that is no pipedream, it is putting bread on the table for AMD. Chasing pipedreams and hoping that the SEC or other authorities will ensure a level playing field in the PC market is not going to happen. Stuck between a rock and a hard place, AMD is doing "well".
     
  24. Pieter3dnow

    Pieter3dnow [H]ardness Supreme

    Messages:
    6,789
    Joined:
    Jul 29, 2009
    Thread crapping in the AMD CPU section, good job.
     
  25. Lunas

    Lunas [H]ardForum Junkie

    Messages:
    9,851
    Joined:
    Jul 22, 2001
    Yes, yes, AMD did well not screwing up the consoles. Big deal; between volume discounts and wholesale pricing they still are not making heaps of cash on those chips, and considering the flash sale of the newly released consoles is over and they are only seeing a decline in demand, they are getting just enough profit from that to keep other sectors open. But they are shrinking as a company, losing people left and right, and they have little to no mainstream market presence.

    What is AMD losing on tablets and phones? R&D costs, as they are trying to break into a market Intel and Nvidia are struggling in. Nvidia is better off, as it seems they are gaining traction on ARM; AMD is doing nothing but playing catch-up, and they don't have the money for gas to get them there.
     
  26. daglesj

    daglesj [H]ardness Supreme

    Messages:
    5,096
    Joined:
    May 7, 2005
    The consoles are probably the only thing that's stopping the banks from closing AMD down. The fact that AMD is a critical supplier for three major companies gives them some protection from the banks just calling it quits.

    The consoles are not making AMD money, they are just keeping AMD for the time being listed as 'viable'.
     
  27. daglesj

    daglesj [H]ardness Supreme

    Messages:
    5,096
    Joined:
    May 7, 2005
    The fact is that for every single Mac Apple sells, it makes 10 times as much as, say, HP or Acer does on each of its PCs. It only has to sell at a ratio of 1:10 against the other manufacturers. Acer has to push a hell of a lot of those $300 boxes (probably with AMD E1 CPUs in them, yay) to keep up.

    They have a product people are prepared to pay a premium for... now that's branding. That's good marketing.

    It's how the likes of Aston Martin and Ferrari get people to pay $200,000 for a car rather than $20,000 for a Hyundai.
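    The back-of-the-envelope unit economics above can be sketched in a few lines of Python. The dollar figures are illustrative assumptions only (not real Apple/HP/Acer financials); they just stand in for the claimed 10:1 per-unit margin ratio:

```python
# Sketch of the margin argument: a premium vendor with 10x the per-unit
# profit needs only 1/10th the volume to match a budget OEM's total profit.
# All dollar figures are made-up placeholders, not real financials.

def units_for_profit(target_profit: float, profit_per_unit: float) -> float:
    """Units that must be sold to reach a given total profit."""
    return target_profit / profit_per_unit

target = 3_000_000                             # same total profit goal for both
apple_units = units_for_profit(target, 300)    # assume ~$300 cleared per Mac
budget_units = units_for_profit(target, 30)    # assume ~$30 cleared per $300 box

print(budget_units / apple_units)  # 10.0: the budget OEM must move 10x the volume
```

    Same conclusion as the post: at a tenth of the margin, the budget OEM has to push ten times the boxes just to stay even.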

    Unfortunately, AMD has NO brand awareness outside a few tech forums.

    "Why would I buy AMD? Never heard of them! Oh that's got an Intel chip in it..heard of them..bing bing bong bing!"


    At the end of the day, most of us posting here are big AMD fans and would love to see them back at the point they were at around 2003/2004, but the fact is many of us can reflect on the past and what's happened recently and have a pretty realistic view of the future. However, one or two here are living in some strange fantasy world where they think AMD's current strategy (if you can call it that) will somehow turn them into a true tech giant everyone wants to buy from.

    Even if somehow AMD did manage to get some kind of lead on Intel... how long do you think it would last before they squandered it like last time?

    It's just sad and frustrating. It's like having a best friend who becomes a junkie or a bad drunk: you pick them up and give them the benefit of the doubt for a while, but after having to mop up the puke for the third time you do begin to wonder if it's all worth it.
     
    Last edited: Mar 6, 2015
  28. defaultluser

    defaultluser [H]ardForum Junkie

    Messages:
    12,531
    Joined:
    Jan 14, 2006
    Nope, just pointing out the many companies that have spent their money more wisely than AMD. Nvidia is a particularly good choice because they are a company of similar size (up until AMD started imploding two years ago).

    Another company that is like AMD was prior to 2008 (when they had fabs) is SanDisk. While they USED to be a pure OEM component supplier, they smartly pushed flash expansion in the massively growing smartphone market by creating miniSD and microSD. They released their own line of MP3 players to offer an alternative to Apple (and to guarantee higher profit margins for their flash chips, plus more sales of microSD expansion). Their vertical integration meant they could charge LESS than Apple while still making a sizeable profit, gaining them about 20% of the market near the end.

    They also branched out into SSDs, making another name for themselves in the high-performance, high-reliability field. These are more consumer-facing product lines than AMD has EVER attempted to offer, and unlike AMD-branded RAM and crap like that, they ACTUALLY DESIGNED AND MANUFACTURED these devices. That is how you run a 6-billion-a-year company that really only makes one product: increase your brand awareness and margins... instead of, say, dropping 6 billion on a pipe dream.

    AMD is doomed. The console dollars are not enough to absorb the massive penalties they will have to continue to pay because they can't place enough orders with GF (the reason for their last two hundreds-of-millions-loss quarters). One can only hope they are smart enough to spin off ATI to a company that can handle the responsibility, so they can fade away without hurting enthusiasts anymore.

    I simply point out all these other successful companies because it's more entertaining than watching AMD slowly bleed to death :(
     
    Last edited: Mar 6, 2015
  29. fixedmy7970

    fixedmy7970 Gawd

    Messages:
    524
    Joined:
    Jul 20, 2013
    Y'all are such a pessimistic bunch. AMD has finally hired some good engineers, and these products are about to be released -> y'all say "OMG AMD BLEEDING TO DEATH"

    Maybe, but that's about to change, just like how the market is changing. AMD can use this to their advantage. Personally, I think being a custom chip maker is a good bet on future markets that will demand specialized performance.
     
    Last edited: Mar 6, 2015
  30. daglesj

    daglesj [H]ardness Supreme

    Messages:
    5,096
    Joined:
    May 7, 2005
    Well...we'll see.


    (Braces for usual disappointment)
     
  31. durquavian

    durquavian Gawd

    Messages:
    757
    Joined:
    Dec 13, 2014
    I wasn't at all disappointed in my 8350. Definitely faster than my previous 965BE, and so far it seems quite a bit smoother than many other computers I have worked on, AMD and Intel. I find it funny, and sad, that so many in these forums, claiming vast knowledge, seem far too eager to perpetuate ignorance and facts taken far out of context.

    Bulldozer performed poorly upon release, not because it was a bad architecture (at least not solely for that reason), but because it was a completely new architecture that not a single program or OS was set up to utilize. Years later the same Bulldozers are performing far better, not because they morphed into new, better-performing chips, but because they finally got optimizations in software and OSes.

    For the great majority of users in the world, the performance differences between Intel and AMD are indistinguishable, even with the fab-level gap. For extreme setups, which become costly, Intel's superiority becomes evident, but that percentage is extremely small as a share of total users. Yet if one were to hang on the words of some posters, one would think running AMD would be like running the original IBM personal computers, which we know it is NOT.
     
  32. defaultluser

    defaultluser [H]ardForum Junkie

    Messages:
    12,531
    Joined:
    Jan 14, 2006
    One important angle you missed there: Bulldozer was designed for massive desktop workstations in an era where people like their workstations to be more portable.

    This thing has a faster processor than Vishera:

    http://www.engadget.com/2012/06/13/apple-macbook-pro-with-retina-display-review/

    http://browser.primatelabs.com/geekbench2/search?page=40&q=AMD+FX-8350

    That processor is faster than the best 125W Vishera, the 8350, and only sips 45W (with iGPU, too). And because people love sexy portable power, they're willing to pay a premium. And that's where the money is!

    And since laptop sales eclipsed desktop sales years ago, you can pretty much see where this is going. AMD needs an efficient architecture in 2016, or they are out. They also need to stay solvent until 2016, which is also questionable (total assets have dropped from 13 billion in 2006 to just 3 billion in 2014).
     
    Last edited: Mar 6, 2015
  33. Lunas

    Lunas [H]ardForum Junkie

    Messages:
    9,851
    Joined:
    Jul 22, 2001
    You missed that AMD needs someone willing to put the APU into a laptop, and I have found very few OEMs who have models with AMD as an option; it is all Celeron and i3, with the high end having i5 and i7.
     
  34. durquavian

    durquavian Gawd

    Messages:
    757
    Joined:
    Dec 13, 2014
    And thank you for posting proof of the very point I made. You did not disprove what I said, and you proved that you are one of those that serve nothing but ignorance.

    First, my PC will blow away that Apple laptop, because I don't run stock, nor am I limited by cooling or form factor. But that was never the point I was making; that was just your inane attempt at deflection.

    And secondly, you like so many others in the habit of blurring the truth, leave out certain details like:

    AMD's assets in 2006 were at that point because Intel ILLEGALLY bribed OEMs not to use AMD hardware, and bribed software vendors to use its ICC compiler, which flagged certain CPUs to use slower libraries to ensure Intel was the leading performer even though factually it was not. Doing so limited AMD's sales and growth to the point that it was forced to sell assets to be able to maintain operations. That is the main and largest reason for AMD's current position, although not the sole reason.

    And because you missed the point by a country mile:

    For the MAJORITY of users, they will not be able to tell a difference between Intel and AMD, other than cost, until you start talking about the high end, or businesses where software is optimized for Intel.

    HIGH-END = multi-GPU setups aimed at 120Hz or greater gameplay, generally costing >$2000.

    Now, this is just in regard to performance. Power use is definitely in Intel's favor, and if that is a concern, and in a great deal of countries and areas it is, then Intel is the solution. But it doesn't change the original point.

    So I guess I will just say Try Again.
     
  35. fixedmy7970

    fixedmy7970 Gawd

    Messages:
    524
    Joined:
    Jul 20, 2013
    ^ Brahs, you're both right.

    Even NVIDIA recently came out and said something to the effect that APUs are largely good enough.

    But yeah, AMD really needs a solid architecture for 2016. It doesn't have to be better than Intel, but IMO as long as it's as power-efficient with good performance gains, I think AMD will be just fine.
     
  36. defaultluser

    defaultluser [H]ardForum Junkie

    Messages:
    12,531
    Joined:
    Jan 14, 2006
    The reason nobody uses AMD APUs is:

    They can't hit 15W with more than Core i3 performance (let alone i5 and i7), and since it costs Core i3 money to make a competitive APU, they have to charge at least as much as Intel to make ends meet. Price/performance fail. There's also no opportunity for upsell if the best AMD parts are slower than Intel's. One-and-done is not how the notebook industry works, and stocking an AMD chip and motherboard is an extra cost that makes no sense when they can just sell an i3 instead.

    Intel in the last four years has shifted the industry from 25-35W being the normal notebook wattage for mainstream platforms and 45W for high-end platforms, down to 15W for all mainstream and 35-45W for high-end. OEMs now standardize ENTIRE DESIGNS around that 15W power point (both to standardize processor inventory and to share things like coolers across multiple lines to reduce costs), so if AMD can't hit it, they lose the design win.

    So AMD could compete on the high end, but since most of those machines are sold as workstations or gaming systems, they have to compete with the quad-core Core i7, which they can't. And powerful integrated graphics is a liability in this range anyway, because most of those 45W systems ship with a discrete GPU.

    So, where do they make sales in the premium part of the spectrum? Oh, I guess they don't. It's not some great conspiracy against them: the OEMs have been struggling to get people to buy new notebooks now that the market and performance have saturated, so they are trying lower-power, sleeker, more portable devices. It is this portion of the market that AMD can't even begin to compete in (and their mobile discrete GPUs have been a laughingstock for about the last four years as well), so yes, they are going to keep losing money.

    AMD had it a lot easier when most people just bought desktops. But times change, and companies that can't change with the times are doomed to failure.
     
    Last edited: Mar 6, 2015
  37. daglesj

    daglesj [H]ardness Supreme

    Messages:
    5,096
    Joined:
    May 7, 2005

    Tell me about it. If a customer brings me a perfectly good 3-4 year old i3/i5 laptop asking if they should buy a new one, I always say no.

    It just needs an SSD and maybe a bit more RAM and it's good to go for another 4 years. I'm doing 3-4 laptops a week like this.

    Folks don't need new laptops. Especially when most of the cheap laptops are now junk. Two to three years ago you could count on at least a proper dual-core Pentium or Athlon. Now it's ultra-low-power Atom/E1 crap.
     
  38. durquavian

    durquavian Gawd

    Messages:
    757
    Joined:
    Dec 13, 2014
    Actually, reading what I could find on Carrizo, it looks pretty good. It has the power savings needed to compete with current Intel in the sub-15W market. And for those that still spout IPC: as a few users mentioned, at that level IPC isn't a big issue unless you are doing CPU-intensive tasks, and most don't. There is a great deal of hope with this release. I just want to see some hands-on to see what the real-world performance and power use is.
     
  39. griff30

    griff30 I Lower the Boom!

    Messages:
    5,399
    Joined:
    Jul 15, 2000
    It's like being a Browns fan.
    You can be optimistic but seriously, when is the last time AMD beat Intel on performance?
     
  40. daglesj

    daglesj [H]ardness Supreme

    Messages:
    5,096
    Joined:
    May 7, 2005
    You have to admire some people's optimism, but sometimes it does look as though they live in some fantasy land/alternate universe.

    I guess those kind of people just won't be swayed to take a more realistic view of things.