AMD Shows Off Next Gen 28nm GPU

Discussion in 'HardForum Tech News' started by FrgMstr, Oct 5, 2011.

  1. FrgMstr

Just got this PR in from AMD about mobile 28nm; it is contained below in full. However, this raises the question of whether or not AMD will have a next-gen 28nm GPU this year. The information I am getting directly from inside the industry is, "likely not, due to process issues at TSMC." If we do in fact see the codenamed "Southern Islands" GPU out of AMD this year, it will very likely be a high-end part with limited availability. And yes, no GlobalFoundries GPU for AMD as of yet.

     
  2. kcmastrpc

    but will it run Rage?
     
  3. ripken204

    please tell me this isn't the new thing...
     
  4. kcmastrpc

    no, actually, i'm trolling kyle.
     
  5. Namesis

Doubt it, with all the rumors, but it would be fantastic to see a high-end HD 7000 series this year. Kepler is what, not even close, due out in six months? Prices will be high, like the HD 5000 series.
     
  6. jeremyshaw

    I expect to see at least a 28nm mobile part by the end of the year :)
     
  7. Chihlidog

    Looking forward to this.
     
  8. Ducman69

    Whatever happened to the integration of CPU/GPU that the merger was supposed to accomplish and was sold as "teh futurez"?

At the very least, I think it would be nice to have a polished hybrid graphics product on my desktop, like those available for laptops.

    I have three 6870s in my desktop, and it turns the computer into a toaster oven heating up the room even when I'm just browsing the desktop and making stupid posts on [H]ardforum.
     
  9. L0s7 4 Lyf3

    :confused:

Sandy Bridge and beyond
    Llano, Zambezi and beyond

    Wake up and pay attention in class
     
  10. Serpent

Screw "teh futurez." While it sounds enticing to have an all-in-one chip that handles everything, it seems limiting for people who want to build a custom PC, or so I feel.

It is good for the smaller stuff like laptops and tablets, though; I just wish they didn't suck or cost way too much, which is likely.
     
  11. Jesse B

    Do you have a mental disability?
     
  12. L0s7 4 Lyf3

All on one chip will reduce overall system cost by needing less silicon, fewer PCBs, and fewer SMT components to get each board working.

Also, it should increase performance: since everything will be integrated on one die, bus transfers and communication will be direct rather than going through some intermediary chipset. It's the same concept as on-die memory controllers. C'mon, man.
     
  13. fattypants

    My wallet is ready
     
  14. TwistedAegis

  15. Serpent

I can see that, but I can also see them doing stupid things like laptops do now, where they give you great CPU power but limited, integrated-graphics-like power, and you end up with, figuratively speaking, a right hand twice the size of the left. You lose choice. If they integrate it and allow further power through an extra GPU, I'm fine with that... but then they might also make it completely shut off and disregard the integrated one, and you end up with an extra part you don't use but are forced to pay for.
     
  16. Serpent

Just wanted to state that I haven't really looked too much into Sandy Bridge and recent processors, so if I'm wrong about anything, feel free to correct me. As I'm sure you will. :D

    Most of what I know I've heard from friends on a limited basis.
     
  17. L0s7 4 Lyf3

As it stands today, Llano is a great step forward in GPU performance at low price points. It reuses an old CPU architecture (which is already more than fast enough for most laptop users) and integrates a low-powered Radeon GPU with it.

Llano has 400 shader processors, half of what my 4870 has; remember that the 4870 was a high-end card just a couple of generations ago. That is especially remarkable when you consider that other IGPs (not on-die) were shipping with 40-80 shader processors. And even those offered better performance than what IGPs were putting out before that... cough, cough, Intel Integrated "Extreme" Graphics... I remember playing Halo 1 in 2002/3 on a system with that chipset, where there was no hardware T&L (IIRC). There was no foliage, textures just looked like smeared, blurred color with no features or definition, and at 640x480 it still maxed out at 15fps. /rant

This means somewhat new games are now playable on the cheapest of laptops.

But of course, there is no replacing a dedicated GPU. The power envelope of a flagship CPU and GPU on one die would be terrifying. It'd be damn near impossible to cool unless the surface area of the die was significantly increased... which isn't likely with 28nm lithography or better.

AMD is transitioning to a future where the GPU and CPU are not even separate circuits on the same chip. They're blurring the lines; it will be one chip that does both parallel (graphics) and general tasks. But the market for add-in cards to give us extra oomph will never be gone, as long as software keeps getting more advanced.
     
  18. MrLonghair

Are we talking 7k series here, or the 28nm GPU seen at IDF etc.?
     
  19. jeremyshaw

The old 645/915/945 (the last being otherwise known as the GMA 950) were DX7-compliant GPUs :p

Because MS was lax about what "DX9" really meant, Intel claimed DX9 support (blame HW caps). At least DX10 did away with the HW caps :)
     
  20. jeremyshaw

I meant "845/915/945".

Also, they indeed didn't have T&L, though the game running slow was more due to audio drivers, lol. I've been in your situation before, trying to play Halo PC on an old Intel IGP, "Intel Extreme Graphics 2" or whatever it was called.
     
  21. L0s7 4 Lyf3

I looked it up after posting:

That system had an 845 chipset; the "Extreme Graphics" die on it was technically different from what the GMA chips had. T&L was supported in software emulation, aka still slow as hell. I don't remember any games that ran faster than 20fps unless they were 3-4 years older than that system, or unless, like in UT2k3/4, you could lower the resolution all the way down to 320x240. :mad:
     
  22. stiltner

    And yet we still don't have Bulldozer.

    I AM DISAPPOINT :(
     
  23. L0s7 4 Lyf3

Weren't those the worst gaming days of your life?

I bought that system in 02/03, wanted to upgrade the graphics, and found out it didn't even have an AGP slot. It still couldn't even run Quake 3 smoothly at decent settings. :(
     
  24. Ducman69

Those are very weak graphics, though, and unlike with laptops, I wasn't aware that you could have a hybrid graphics setup on the desktop so that your dedicated "hardcore" GPUs aren't constantly wasting juice and heating up your mom's basement (hence the comparison to hybrid graphics on my laptop).

And yeah, good catch on the brainfart. I had three 4870s (a 4870X2 and a 4870) and replaced them with 6850s, actually... sorry, late + alcohol + stupid + no edit button = herping the derp. :)
     
  25. Red Falcon

AMD needs to step it up a bit. Showing off tech like this just shows how much they are hurting.

If Bulldozer doesn't come through for them, they are going to be in a (bigger) world of hurt.
     
  26. L0s7 4 Lyf3

Weak is relative.

Llano veritably nukes everything else at 45W (note: CPU and GPU) and at its price points. No, it's not going to hold a candle to a 150W+ dedicated GPU, but it offers darn near mid-range desktop performance in inexpensive laptops. It's a substantial leap forward for the average user.

Plus, it has a true DX11 feature set, the lack of which, as mentioned, plagued Intel chipsets for years. "Our chipsets are DX9 compatible! Everything DX9 will run on it; it just doesn't have any of the DX7/8/9 features you want." The 845 chipset I brought up was, strictly speaking, fully compatible only up to DX6, at a time when Microsoft was releasing DX9 and SM2.0, aka real shaders that actually started making real-time graphics look somewhat realistic. That's the hole that AMD/ATi have dug us out of. Today you can go buy an A-series laptop and play new games at acceptable settings with acceptable performance. Ten years ago, if you bought a mid-range desktop, you couldn't run squat on it unless you installed your own graphics card. I had a friend who bought a Sony Vaio about 10 years ago, an expensive system then, that only had a GeForce 2 when GF4 Ti 4600s and 9700 Pros were king of the hill.

If AMD carries through on Fusion, you'll be able to buy one chip and play most games at medium settings, maybe better: the performance of a respectable mid-range discrete graphics card and a traditional CPU combined.
     
  27. Unexploded

    Hoping to go SLI? :D
     
  28. Red Falcon

    I really hope so, AMD needs to have an advantage on one of the industry fronts at least.
     
  29. Emission

    AMD's current fusion chips are pretty decent for the TDP envelopes.
     
  30. Emission

    And relatively inexpensive to boot.
     
  31. RogueKitsune

I am hoping that the new GPUs will add support for 10-bit H.264 video decode instead of just 8-bit. Also, I really would like the next set of GPUs to be an upward shift in performance instead of a seemingly horizontal shift.
     
  32. Da1Nonly

    Waiting for the real 7000 series. Not this laptop stuff.
     
  33. L0s7 4 Lyf3

    http://www.guru3d.com/news/radeon-hd-7000-series-specs-leaked-/
     
  34. DualOwn

I was expecting the 7970 to have more stream processors than that.

But if the price is right, then that's a different story.
     
  35. L0s7 4 Lyf3

That's 30% more than the current top single-GPU card, a hefty increase. I agree with nVidia, though: we've got plenty of processing power for shaders. We need better geometry to make things look better. Fewer flat, featureless walls and more facets and detail are what's going to make things look appreciably better.
     
  36. hardware_failure

    Looks good on paper... does that mean we will see it come out ~10 years later just like BD?
     
  37. fatrat

    Not everybody wants to use an AMD cpu, so yeah, you still gotta make 'em separate so people have more choice.

    If AMD didn't do this, Nvidia would rule the world.
     
  38. MrGuvernment

Dammit, I am stuck on a 4850 512MB card with my i5 rig and 16GB of RAM... I wanted next gen... I should settle for a 6-series, I guess.
     
  39. Dr. Righteous

    Yikes.

I bet those people who stood in line at Best Buy and places like that, waiting for it to go on sale, are PISSED with such a buggy title. $59.95? Yeah, right.
I will pick it up when it is $10 in a few months.
     
  40. Dr. Righteous

What? Are you saying no more console ports??? Spread the word! The answer has been found! (I can dream, can't I?)