AMD Next Gen confirmed for Q4 2011: Dual architecture split like HD6K series

Wow, sad news indeed if true. :( On the positive side, I don't foresee Rage, Skyrim, or Dead Island really stressing out high end setups. The only game that will stress the highest end systems out there will be Battlefield.

I play in eyefinity though, I was looking forward to the extra horsepower to be honest :(
 
I read some really disheartening news today. Say what you will about Fudzilla, but they were very accurate with the GTX 500 and Radeon 6800 and 6900 launches. It seems both TSMC and GloFo are having a tougher time than usual getting good results from their 28nm processes.

28 nm not in great shape
http://www.fudzilla.com/home/item/23909-28nm-not-in-great-shape

To make matters worse, TSMC is not having a great quarter, which could also impact things:

http://www.fudzilla.com/home/item/23958-tsmc-warns-things-not-as-good-as-they-should-be

I have a pretty good intuition about these things, and I think the rumors of the mid-range chips coming first are DEFINITELY false.

With mid-range chips you need quantities. Much, much more than a high-end chip. I would gamble that mid-range chips outsell high-end chips 100:1, if not more. Why on earth would either of these companies launch their mid-range chips first when both companies that manufacture their chips can't seem to get satisfactory results from their production this late in the game?

It's time for many of us to set our dreams aside for a second (myself included) and think with a clear and rational mind. With both fabrication companies struggling worse than they have in years to get good results from a die process, if any chip is set to launch it'll be high end, where a shipment of 4,000 for the whole world is considered a hard launch. In the mid range or mainstream you need tens of thousands to be considered sufficient supply.

I believe we will have a paper launch in mid-Q1 2012 or a hard launch in early Q2 2012. I really hope things change soon, but from what I'm hearing this is my best guess. We need to start planning for the possibility that we'll be playing Battlefield 3, Rage, Skyrim, and Dead Island with what we have now. :(

The other possibility is Fud' are flaming fanboys and they're just repeating nVidia's "It's all TSMC's fault" propaganda. We'll see. AMD, so far, hasn't backed down on any prior statements.
 
It's been expected for a while that what AMD means by launching this year is launching the low end to mid range products.
 
The other possibility is Fud' are flaming fanboys and they're just repeating nVidia's "It's all TSMC's fault" propaganda. We'll see. AMD, so far, hasn't backed down on any prior statements.

Rumors are that TSMC's process is not as smooth as it ought to be. Charlie says it is pretty much in their hands now.
 
If the 7950 has a power consumption of 150 watts (I'm assuming that's load?), does that mean it would be possible to crossfire it on my current 650 watt power supply?

Depends on what speed you're running your 2600K and how far you'd overclock the 7950s. But good PSUs are so cheap these days that I think you'd be better off getting a new one at the same time.
 
If they use less power than a 6970 (edit: 250 W TDP), and they will, then a GOOD 650 W PSU will work. That's about the safe limit for a 650 W PSU though, even a good one.
 
Good to know. I was thinking of crossfiring my 6950 depending on how it runs BF3, but after seeing the power consumption of the 7 series... it's kind of hard to justify. Might as well save the money from the electricity bill. :p
 
Good to know. I was thinking of crossfiring my 6950 depending on how it runs BF3, but after seeing the power consumption of the 7 series... it's kind of hard to justify. Might as well save the money from the electricity bill. :p

electricity bill?
we don't talk about such non-[H] things here...
 
Two 6950s should run fine on a 650W PSU; if the 150W power consumption figure is right, it should work without a hitch.
 
The highest I've seen my two HD6970s draw (OCCT PSU stress test, so effectively furmark + linpack on the CPU) with quite a power-hungry platform, is around 700W out of the PSU. So if you're running two 6950s, you're looking at 600W tops in an extreme stress environment, probably less. The only stumbling block is four 6-pins. Not all 650W units have them.
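For anyone sanity-checking those numbers, here's a back-of-envelope headroom estimate in Python. All the wattages are the rough figures quoted in this thread (the 200 W platform draw is my own assumption), not measurements:

```python
# Rough PSU headroom estimate using figures quoted in this thread.
psu_rating_w = 650    # a "good" 650 W unit, as discussed above
platform_w = 200      # assumed draw for an overclocked CPU, board, and drives
gpu_load_w = 150      # rumored HD7950 load figure from earlier in the thread
num_gpus = 2          # crossfire pair

total_w = platform_w + num_gpus * gpu_load_w
headroom_w = psu_rating_w - total_w
print(f"Estimated system load: {total_w} W, headroom: {headroom_w} W")
# → Estimated system load: 500 W, headroom: 150 W
```

That leaves roughly 150 W of slack on a 650 W unit, which matches the "safe limit" call above, provided the PSU actually has the four 6-pin connectors.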
 
The other possibility is Fud' are flaming fanboys and they're just repeating nVidia's "It's all TSMC's fault" propaganda. We'll see. AMD, so far, hasn't backed down on any prior statements.
Yup. I love people who do nothing but regurgitate the word vomit fudzilla likes to pass as "news" lol.

Here is some real news for a change:
At the Deutsche Bank Technology conference in Las Vegas today (http://conferences.db.com/americas/tech11/), Thomas Seifert mentioned that they'd be demoing the world's first 28nm GPU. So it looks like this event: http://amd-member.com/newsletters/DevCentral/FusionZone2011.html, might have both Bulldozer and a sneak peek at the new GPUs.

Oh, and it seems there will not be a dual architecture split (there is no need, because GCN-based GPUs will be able to do dual graphics with VLIW-based APUs like Trinity), so the HD7K series will be a top-to-bottom new architecture.
 
Nah, still too early for that. They said Q4 so I'm not expecting them any earlier than that.
 
Rumors are that TSMC's process is not as smooth as it ought to be. Charlie says it is pretty much in their hands now.
Well, come to think of it, when has a TSMC process ever gone smoothly? Short of 55nm :p I mean, I don't think anyone does well off the bat; Intel just gets a free pass since they are Intel, and AMD/GloFo are just ignored since they are normally irrelevant in this.


Just IMO :p
 
TSMC has gotten much much worse over the years. Really everyone except Intel has. Below 90nm lithography is very hard to do since they're pushing their equipment and known physics to the limit to produce working hardware.
 
TSMC has gotten much much worse over the years. Really everyone except Intel has. Below 90nm lithography is very hard to do since they're pushing their equipment and known physics to the limit to produce working hardware.

You are forgetting the fact that Intel is not a foundry so they can devote all their resources to taping out their own designs and tailoring their process tech to suit those designs.
TSMC on the other hand has considerably more tape outs per year than Intel because they are a foundry business, so they have to work with many different clients whose designs have different tolerances and requirements. Tailoring a process to the needs of each individual design takes considerably more time and effort. It's actually pretty amazing TSMC has kept up all these years.
 
You are forgetting the fact that Intel is not a foundry so they can devote all their resources to taping out their own designs and tailoring their process tech to suit those designs.
TSMC on the other hand has considerably more tape outs per year than Intel because they are a foundry business, so they have to work with many different clients whose designs have different tolerances and requirements. Tailoring a process to the needs of each individual design takes considerably more time and effort. It's actually pretty amazing TSMC has kept up all these years.

Yeah. Intel's 32nm process was only used on a $1000+ chip, and on a tiny stinkin' dual-core die that had many of its parts put on a separate 45nm daughter die. :p

Only now do we see it on a mainstream "high end" part with okay pricing :p
 
You are forgetting the fact that Intel is not a foundry so they can devote all their resources to taping out their own designs and tailoring their process tech to suit those designs.
That is a design tool issue not a process issue.

TSMC on the other hand has considerably more tape outs per year than Intel because they are a foundry business, so they have to work with many different clients whose designs have different tolerances and requirements. Tailoring a process to the needs of each individual design takes considerably more time and effort. It's actually pretty amazing TSMC has kept up all these years.
It's great and all that they get lots of tape-outs, but Intel still has a ~2-year lead on everyone else when it comes to process performance while supposedly still maintaining excellent yields. You're just gonna come off looking silly if you try to play down that mountain into a molehill.
 
Damn, they've been marketing bulldozer for nearly a year now. :(

Come on AMD, Intel needs some competition!
 
This isn't bulldozer, this is the HD7 series we're discussing here.
AMD Bulldozer is due out in a month (but, from what I've seen so far, it looks quite disappointing)
 
Ah, apologies I'm having one hell of a day.

But yes, I am excited about the HD7 series (and Nvidia too, if they add plenty of DisplayPorts). I think Bulldozer would have been good a year or two ago. Their take on adding more cores could turn out OK (MS has made threading a great deal easier in .NET 4.5/Win8).

I'm looking forward to seeing what GPUs can do with double the transistor count (that too is overdue)!
 
It's not as bad as Fermi, but nvidia are once again behind with Kepler; their first new products to market will be a 28nm shrink of the existing Fermi products, and even then they're likely to coincide with AMD's release. Kepler will probably be a few months behind. If, with the new architecture, nvidia can get their performance per watt as high as AMD's before the die shrink, then produce a 250W TDP card from it, its performance should be quite impressive. (Typically performance/watt improves 50% or so with a die shrink, and nvidia are currently about 20% behind AMD on this with their high-end parts. Apply these together on top of the GTX 580 and we could see GTX 590 level performance out of the next solo GPU. Bad news is, it probably won't hit for another 11-12 months.)
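The arithmetic in that projection can be sketched quickly. These multipliers are the rough estimates from the post above, not measured data:

```python
# Sketch of the perf/watt projection above; all factors are rough estimates.
gtx580_perf = 1.0    # normalize the GTX 580's performance to 1.0
catch_up = 1.20      # close the ~20% perf/watt gap to AMD via the new architecture
shrink_gain = 1.50   # typical perf/watt improvement from a die shrink

# At a comparable ~250 W TDP, performance scales with performance per watt:
projected = gtx580_perf * catch_up * shrink_gain
print(f"Projected next-gen solo GPU: {projected:.1f}x a GTX 580")
# → Projected next-gen solo GPU: 1.8x a GTX 580
```

That lands around 1.8x a GTX 580, i.e. roughly GTX 590 territory, which is where the post's estimate ends up.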
 
no, Q1-Q2 2012.
Is this for real? Damn, I could really use an upgrade now for BF3 but been delaying because I thought the new AMD line was still due by oct-nov.

Right now I'm running GTX460 1GB. What are my options if I want at least 50% more performance in ~350€, HD 6970?
 
Right now I'm running GTX460 1GB. What are my options if I want at least 50% more performance in ~350€, HD 6970?

I've got a gtx460, I tried a 6970 and didn't really see a massive improvement. Unless you really need it, I think you will be kicking yourself once these are released.
 
As in 75-100% faster? As I understand, 6970 is roughly comparable to GTX570. First I thought about getting a GTX560 Ti, but the gains aren't worth the price imo.

So I was under the impression GTX570 would be roughly 50% faster than 460, but 6970 would run cooler and consume less power. That's why I thought about opting in AMD this time around.
 
As in 75-100% faster? As I understand, 6970 is roughly comparable to GTX570. First I thought about getting a GTX560 Ti, but the gains aren't worth the price imo.

So I was under the impression GTX570 would be roughly 50% faster than 460, but 6970 would run cooler and consume less power. That's why I thought about opting in AMD this time around.

I don't know...
I would say that a 6970 surpasses a 570 and lies between the 570 and 580, even beating the 580 in quite a few games.
 
At 1920x1200, the HD6970 is about 45% faster than the 1GB 460, or 60% faster than the 768MB version.
At higher resolutions than that the difference increases to about 55% and 75%.
The HD6970 is more powerful than the GTX570, but at 1920x1200 it is not really able to make use of the architectural differences and sits around the same performance level. It takes a lead in crossfire as CF scaling with the HD6 series is slightly better than SLI scaling on the GTX5s.
 
This really sucks. I will sell my 5770 to a friend this week because he needs a GPU, and since I live in China and all of my other friends run laptops, this is probably my only chance to get any money back. I'm really hoping that a 7770-7850 comes out soon. I want a sub-$200 card, preferably sub-$160.
 
The HD7850 is likely to appear for about $150-200 and will perform somewhere between an HD6870 and an HD6950, but probably won't be on sale until February-March. If you want to spend that sort of money now, just buy an HD6870.
 
Yeah, I said screw it and bought a 6950 2GB card (arrives tomorrow, yay!)

Should the need arise I can throw another in there for maxing everything at 1920x1080 in games. While I would have loved to wait for the 7K series, it's just not worth it when I can get great deals on 6950 2GB cards today. It's worth paying the money to have 5 or 6 months of fun gaming vs. waiting around with my thumb up my ass watching everyone else have fun.
 
I hope AMD can release these early next year, like January or something. When do they usually release graphics cards during Q1?
 
I hope AMD can release these early next year, like January or something. When do they usually release graphics cards during Q1?

The 6800s came out October last year, the 6900s in December (although if I remember right they weren't really available until mid-January), and the 6990 wasn't out until March.

Hopefully it'll be similar this time. I'd really like to hold off on my new rig until the 7970s are out, but I hate waiting...maybe I'll just grab some cheap sub-$150 card in the meantime as a placeholder...
 