AMD Shows Off Next Gen 28nm GPU

FrgMstr

Just got this PR in from AMD about mobile 28nm; it is contained below in full. However, it raises the question of whether or not AMD will have a next-gen 28nm GPU this year. The information I am getting directly from inside the industry is, "likely not, due to process issues at TSMC." If we do in fact see the codenamed "Southern Islands" GPU out of AMD this year, it will very likely be in the form of a high-end part with limited availability. And yes, no GlobalFoundries GPU for AMD as of yet.

TAIPEI, Taiwan – October 5, 2011 – At Fusion 2011, AMD (NYSE:AMD) today demonstrated its next generation graphics processor, based on the cutting-edge 28 nm process technology. The demonstration was delivered by Corporate Vice President and General Manager of AMD’s Graphics Division, Matt Skynner, as part of his keynote titled, “Enabling the Best Visual Experience.” Skynner demonstrated a notebook-based version of AMD’s 28 nm next-generation graphics processor delivering a smooth, high-resolution game experience while playing Bioware’s popular role-playing title, Dragon Age 2.

“AMD strives to be at the forefront of every key inflection point in graphics technology, as demonstrated by our leadership in everything from process node transitions, to adoption of the latest graphics memory,” said Skynner. “Our pace-setting transition to the 28nm process node, coupled with new innovations in our underlying graphics architecture, is already generating excitement among the ODM community here in Taipei this week.”
 
I doubt it with all the rumors, but it would be fantastic to see a high-end HD 7000 series this year. Kepler is what, not even close to being due out for another six months? Prices will be high, like with the HD 5000 series.
 
Whatever happened to the integration of CPU/GPU that the merger was supposed to accomplish and was sold as "teh futurez"?

At the very least, I think it would be nice to have a polished hybrid graphics product on my desktop, like the ones available for laptops.

I have three 6870s in my desktop, and it turns the computer into a toaster oven heating up the room even when I'm just browsing the desktop and making stupid posts on [H]ardforum.
 
Screw the "teh futurez" While it sounds enticing to have an all in one chip that handles everything, it seems limiting to people who want to build custom PC, or so I feel.

It is good for the smaller stuff like laptops and tablets though, just wish they don't suck or cost way too much which is likely.
 
I have three 6870s in my desktop, and it turns the computer into a toaster oven heating up the room even when I'm just browsing the desktop and making stupid posts on [H]ardforum.

Do you have a mental disability?
 
Screw the "teh futurez" While it sounds enticing to have an all in one chip that handles everything, it seems limiting to people who want to build custom PC, or so I feel.

It is good for the smaller stuff like laptops and tablets though, just wish they don't suck or cost way too much which is likely.
(missed the d in Sandy Bridge in my last post)

All-on-one-chip will reduce overall system cost by needing less silicon, fewer PCBs, and fewer SMT components to get each board working.

Also, it should increase performance: with everything integrated on one die, bus transfers and communication are direct rather than going through some other intermediary chipset. It's the same concept as on-die memory controllers. C'mon man.
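For what it's worth, a rough back-of-envelope sketch of why that matters for data movement; the bandwidth figures below are assumptions for illustration only, not measurements:

    // Illustrative only: time to move one 1080p RGBA frame across a
    // PCIe 2.0 x16 link (~8 GB/s per direction, theoretical) versus an
    // assumed much wider on-die path.
    #include <cstdio>

    int main()
    {
        const double frame_bytes   = 1920.0 * 1080.0 * 4.0;  // ~8.3 MB
        const double pcie2_x16_bps = 8.0e9;                   // assumed PCIe 2.0 x16
        const double on_die_bps    = 100.0e9;                 // assumed on-die link

        printf("Over PCIe: %.3f ms\n", frame_bytes / pcie2_x16_bps * 1e3);
        printf("On-die:    %.3f ms\n", frame_bytes / on_die_bps * 1e3);
        return 0;
    }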
 

I can see that, but I can also see them doing stupid things like what laptops are doing now, where they give you great CPU power but limited integrated-graphics power, and you end up with, figuratively speaking, a right hand twice the size of the left. You lose choice. If they integrate it and still allow extra power from a discrete GPU, I'm fine with that... but then they might also make it so the discrete card completely shuts off and disregards the integrated one, and you end up with an extra part you don't use but are forced to pay for.
 
Just wanted to state that I haven't really looked too much into Sandy Bridge and recent processors, so if I'm wrong about anything, feel free to correct me. As I'm sure you will. :D

Most of what I know I've heard from friends on a limited basis.
 
As it stands today, Llano is a great step forward in GPU performance at low price points. It reuses an older CPU architecture (which is already more than fast enough for most laptop users and uses) and integrates a low-power Radeon GPU with it.

Llano has 400 shader processors, half of what my 4870 has; remember that the 4870 was a high-end card just a couple of generations ago. That is especially remarkable when you consider that other IGPs (not on-die) were coming with 40-80 shader processors, and even those offered better performance relative to what IGPs were putting out before that... cough, cough, Intel Integrated "Extreme" Graphics... I remember playing Halo 1 in 2002/3 on a system with that chipset, where there was no hardware T&L (IIRC). There was no foliage, textures just looked like smeared/blurred color with no features or definition, and at 640x480 it still maxed out at 15fps. /rant
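As a rough sanity check on those numbers, theoretical throughput for these VLIW Radeon parts works out to stream processors x 2 ops (multiply-add) x clock; the clocks below are assumed figures (roughly 600 MHz for a desktop Llano IGP, 750 MHz for the 4870) and the sketch is purely illustrative:

    // Illustrative only: theoretical single-precision GFLOPS =
    // stream processors * 2 (multiply-add) * clock in GHz.
    #include <cstdio>

    int main()
    {
        const double llano_sps = 400, llano_ghz = 0.60;   // assumed clock
        const double r4870_sps = 800, r4870_ghz = 0.75;   // assumed clock

        printf("Llano IGP: ~%.0f GFLOPS\n", llano_sps * 2 * llano_ghz);
        printf("HD 4870:   ~%.0f GFLOPS\n", r4870_sps * 2 * r4870_ghz);
        return 0;
    }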

This means relatively new games are now playable on the cheapest of laptops.

But of course, there is no replacing a dedicated GPU. The power envelope of a flagship CPU and GPU on one die would be terrifying. It'd be damn near impossible to cool unless the surface area of the die were significantly increased... which isn't likely with 28nm lithography or better.

AMD is transitioning to a future where the GPU and CPU are not even separate circuits on the same chip. They're blurring the lines; it will be one chip that does both parallel (graphics) and general tasks. But the market for add-in cards that give us extra oomph will never be gone, as long as software keeps getting more advanced.
 

The old 645/915/945 (the last being otherwise known as the GMA 950) were DX7-compliant GPUs :p

Because MS was lax about what DX9 support really meant, Intel could claim DX9 support (blame HW caps). At least DX10 did away with the HW caps :)
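For anyone curious, a minimal sketch of how a D3D9 application reads those hardware caps to see what the chip actually implements (standard d3d9.h API; purely illustrative):

    // Query the adapter's D3D9 caps and report what is really there.
    #include <d3d9.h>
    #include <cstdio>
    #pragma comment(lib, "d3d9.lib")

    int main()
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        D3DCAPS9 caps;
        if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
            d3d->Release();
            return 1;
        }

        // Hardware transform & lighting (the "T&L" mentioned above)
        bool hwTnL = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) != 0;
        // Shader model versions actually implemented by the hardware
        bool vs20  = caps.VertexShaderVersion >= D3DVS_VERSION(2, 0);
        bool ps20  = caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0);

        printf("HW T&L: %d  VS 2.0: %d  PS 2.0: %d\n", hwTnL, vs20, ps20);

        d3d->Release();
        return 0;
    }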
 

I meant "845/915/945"

also, they indeed didn't have TnL. Though the game running slow was more due to audio drivers, lol. I've been in your situation before, trying to play halo PC on an old Intel Integrated IGP "Intel Extreme Edition 2" or whatever it was called.
 
I looked it up after posting:

That system had an 845 chipset; the "Extreme Graphics" die on that was technically different from what the GMA chips had. T&L was supported in software emulation, aka still slow as hell. I don't remember any games that ran faster than 20fps unless they were 3-4 years older than that system. Or, like in UT2k3/4, you could lower the resolution all the way down to 320x240. :mad:
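That fallback is roughly what a D3D9 title does at device creation: if the caps say there is no hardware T&L, it asks for software vertex processing and the CPU does the transform work. A minimal illustrative sketch (standard d3d9.h API, error handling trimmed):

    #include <d3d9.h>

    // Create a windowed D3D9 device, falling back to software vertex
    // processing (CPU-side T&L) when the hardware doesn't offer it.
    IDirect3DDevice9* CreateDeviceWithFallback(IDirect3D9* d3d, HWND hwnd)
    {
        D3DCAPS9 caps;
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        DWORD vp = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
                       ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                       : D3DCREATE_SOFTWARE_VERTEXPROCESSING;  // CPU does T&L

        D3DPRESENT_PARAMETERS pp = {};
        pp.Windowed         = TRUE;
        pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
        pp.BackBufferFormat = D3DFMT_UNKNOWN;
        pp.hDeviceWindow    = hwnd;

        IDirect3DDevice9* dev = nullptr;
        d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd, vp, &pp, &dev);
        return dev;  // nullptr if creation failed
    }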
 
I meant "845/915/945"

also, they indeed didn't have TnL. Though the game running slow was more due to audio drivers, lol. I've been in your situation before, trying to play halo PC on an old Intel Integrated IGP "Intel Extreme Edition 2" or whatever it was called.
Weren't those the worst gaming days of your life?

I bought that system in 02/03, wanted to upgrade the graphics, and found out it didn't even have an AGP slot. It still couldn't even run Quake 3 smoothly at decent settings. :(
 
:confused:

Sandy Brige and beyond
Llano, Zambezi and beyond

Wake up and pay attention in class
Those are very weak graphics though, and unlike with laptops, I wasn't aware that you could have a hybrid graphics setup so your dedicated "hardcore" GPUs aren't constantly wasting juice and heating up your mom's basement (hence the comment about "hybrid graphics like on my laptop").

And yeah, good catch on the brainfart. I had three 4870s (a 4870X2 and a 4870) and actually replaced them with 6850s... sorry, late + alcohol + stupid + no edit button = herping the derp. :)
 
AMD needs to step it up a bit. Showing off tech like this just shows how much they are hurting.

If Bulldozer doesn't come through for them, they are going to be in a (bigger) world of hurt.
 
Those are very weak graphics though, and unlike with laptops, I wasn't aware that you could have a hybrid graphics setup so your dedicated "hardcore" GPUs aren't constantly wasting juice and heating up your mom's basement (hence the comment about "hybrid graphics like on my laptop").

And yeah, good catch on the brainfart. I had three 4870s (a 4870X2 and a 4870) and actually replaced them with 6850s... sorry, late + alcohol + stupid + no edit button = herping the derp. :)
Weak is relative.

Llano veritably nukes everything else at 45W (note: CPU AND GPU) and at its price points. No, it's not going to hold a candle to a 150W+ dedicated GPU, but it offers darn near mid-range desktop performance for inexpensive laptops. It's a substantial leap forward for the average user.

Plus it has a true DX11 feature set, something that, as mentioned, Intel chipsets went without for years. "Our chipsets are DX9 compatible! Everything DX9 will run on it, it just doesn't have any of the DX7/8/9 features you want." The 845 chipset I brought up was, strictly speaking, only fully compatible up to DX6 at a time when Microsoft was releasing DX9 and SM2.0, aka real shaders that actually started making real-time graphics look somewhat realistic. That's the hole that AMD/ATi have dug us out of. Today you can go buy an A-series laptop and play new games at acceptable settings with acceptable performance. Ten years ago, if you bought a mid-range desktop, you couldn't run squat on it unless you installed your own graphics card. I had a friend who bought a Sony Vaio about 10 years ago, an expensive system then, that only had a GeForce 2 when GF4 Ti 4600s and 9700 Pros were king of the hill.
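A quick way to see what a "true DX11 feature set" means in practice is to ask the D3D11 runtime which feature level the GPU actually comes back at; a minimal sketch (standard d3d11.h API; purely illustrative):

    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    // Create a device and report the highest feature level the hardware
    // supports; a real DX11 part reports D3D_FEATURE_LEVEL_11_0.
    D3D_FEATURE_LEVEL QueryFeatureLevel()
    {
        const D3D_FEATURE_LEVEL wanted[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
            D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3,
            D3D_FEATURE_LEVEL_9_1,
        };

        ID3D11Device* dev = nullptr;
        D3D_FEATURE_LEVEL got = (D3D_FEATURE_LEVEL)0;

        D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                          wanted, sizeof(wanted) / sizeof(wanted[0]),
                          D3D11_SDK_VERSION, &dev, &got, nullptr);
        if (dev) dev->Release();
        return got;  // 0 if no device could be created
    }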

If AMD carries through on Fusion, you'll be able to buy one chip and play most games at medium settings, maybe better: one chip combining the performance of a respectable mid-range discrete graphics card with a traditional CPU.
 
I really hope so; AMD needs to have an advantage on at least one of the industry fronts.
 
Screw the "teh futurez" While it sounds enticing to have an all in one chip that handles everything, it seems limiting to people who want to build custom PC, or so I feel.

It is good for the smaller stuff like laptops and tablets though, just wish they don't suck or cost way too much which is likely.

AMD's current Fusion chips are pretty decent for their TDP envelopes.
 
I am hoping that the new GPUs will add support for 10-bit x264 video decode instead of just 8-bit. Also, I really would like the next set of GPUs to be an upward shift in performance instead of a seemingly horizontal one.
 
I was expecting the 7970 to have more streaming cores than that.

But if the price is right, then that's a different story.
 
That's 30% more than the current top single-GPU card. A hefty increase. I agree with nVidia though: we've got plenty of processing power for shaders. We need better geometry to make things look better. Fewer flat, featureless walls and more facets and detail are what's going to make things look appreciably better.
 
Whatever happened to the integration of CPU/GPU that the merger was supposed to accomplish and was sold as "teh futurez"?

At the very least, I think it would be nice to have a polished hybrid graphics product on my desktop, like the ones available for laptops.

I have three 6870s in my desktop, and it turns the computer into a toaster oven heating up the room even when I'm just browsing the desktop and making stupid posts on [H]ardforum.

Not everybody wants to use an AMD CPU, so yeah, you still gotta make 'em separate so people have more choice.

If AMD didn't do this, Nvidia would rule the world.
 
Dammit, I am stuck on a 4850 512MB card with my i5 rig and 16GB of RAM... I wanted next gen... should I settle for a 6 series, I guess?
 
That's 30% more than the current top single-GPU card. A hefty increase. I agree with nVidia though: we've got plenty of processing power for shaders. We need better geometry to make things look better. Fewer flat, featureless walls and more facets and detail are what's going to make things look appreciably better.

What? Are you saying no more console ports??? Spread the word! The answer has been found! (I can dream, can't I?)
 