AMD Zen Rumours Point to Earlier Than Expected Release

Yup... a few examples:

Xeon E5-2640 v3: 8c/16t, turbo 3.4GHz, 90W.

Xeon E5-2630 v3: 8c/16t, turbo 3.2GHz, 85W.

Xeon E5-2630L v3: 8c/16t, turbo 2.9GHz, 55W. Yeah, 55W.

What about the big brothers?

Xeon E5-2660 v3: 10c/20t, turbo 3.3GHz, 105W.

Xeon E5-2650L v3: 12c/24t, turbo 2.5GHz, 65W. Yeah... 65W for 24 Haswell threads.

Xeon E5-2683 v3: 14c/28t, turbo 3.0GHz, 120W.

What about some 130W+?

Xeon E5-2698 v3: 16c/32t, turbo 3.6GHz, 135W.

Xeon E5-2699 v3: 18c/36t, turbo 3.6GHz, 145W... so yeah, those are a lot of Haswell cores and threads for 145W.

Of course the problem with Xeons is the price, but you can find a lot of 8c/16t parts for less than $600, while others like the 16c/32t go for above $2K... but the threads-per-watt are there for anyone who is more concerned about horsepower per watt than money...
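If anyone wants to rank these themselves, here's a quick back-of-the-envelope sketch in Python using the specs quoted above (TDP is only a proxy for real power draw, so treat this as ballpark math):

```python
# Threads-per-watt for the Haswell-EP Xeons listed above: (threads, TDP in W).
# TDP is a proxy for actual power draw, so this is ballpark math only.
xeons = {
    "E5-2630L v3": (16, 55),
    "E5-2630 v3":  (16, 85),
    "E5-2640 v3":  (16, 90),
    "E5-2660 v3":  (20, 105),
    "E5-2650L v3": (24, 65),
    "E5-2683 v3":  (28, 120),
    "E5-2698 v3":  (32, 135),
    "E5-2699 v3":  (36, 145),
}

for name, (threads, tdp) in sorted(xeons.items(),
                                   key=lambda kv: kv[1][0] / kv[1][1],
                                   reverse=True):
    print(f"{name:12s} {threads:2d}t / {tdp:3d}W = {threads / tdp:.2f} threads/W")
```

The 2650L v3 comes out on top at about 0.37 threads per watt, which is the point being made above.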


Good links, yeah Intel is milking the high end lol.
 
@Derfnofred:

Yes, the Intel 6 and 8 core processors pull just as much juice as the 9590 when OC'd. It's just that they also do a lot more processing for the power that they use. I do find it amusing that people act like they are more energy efficient. I suppose they are slightly more energy efficient, though, due to the process node they are on.

IOW, it doesn't matter how high the IPC is--if it's using more power, it's a POS.

Obviously a fallacy, but it does seem people act that way at times. I'm also not saying that's what cagemaru meant when he said this--honestly, I'm having a bit of a time making heads or tails of what he said.
 
Guys, this is ridiculous. I'm not sure what windmills you're tilting at.

If processor A and processor B consume the same power, but processor A does "a lot more processing for the power that they use", then processor A is absolutely more power efficient than processor B.

Hence why I said that IOPS or FLOPS (or some weighted composite of various instructions) per watt IS energy efficiency. (The seconds term cancels from both, giving us instructions per joule of energy.) End of story. Other than sweeping said instructions-per-joule through various workloads, there's no other appropriate narrative when talking about the energy efficiency of a processor.
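For anyone who wants the cancellation spelled out, here it is in one line (this just restates the arithmetic above, nothing new):

```latex
\[
\frac{\mathrm{instructions}/\mathrm{second}}{\mathrm{joules}/\mathrm{second}}
  = \frac{\mathrm{instructions}}{\mathrm{joule}},
\qquad \text{since } 1\,\mathrm{W} = 1\,\mathrm{J/s}.
\]
```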

IPC and power are interrelated, but not directly. Not sure where the heck you're going with that. The end user cares about how fast something gets done, not whether it's a sky-high clock with lower IPC or vice versa.

*Likewise there's a lot of embedded, ultra-low power stuff which is both energy efficient and incredibly slow! For, you know, those instances where you don't need to calculate things very quickly, and you want to get 5 years out of a watch battery.
 
Guys, this is ridiculous. I'm not sure what windmills you're tilting at.

Yep, some people just don't understand that efficiency is always a ratio: work being done over time, cost, or, in this case, power consumed.
 
Intel prices are nearly the same as in the Nehalem era.

The only exception is price-gouged Skylake, but that's the result of Intel not producing enough.

Really? The i7-860 launched at a $284 MSRP. The price of that CPU tier has slowly crept up over all those years without adding any more cores. The 4770K launched at $339, the 3770K at $313, and Sandy was about the same.

Now Skylake is $350 and Broadwell is even more.

All still quad-core, low-power CPUs.

Intel needs some competition. You don't see GPU dies getting smaller with every process while performance stays pretty similar.
 
Hey, I still get 60 fps in all my games. ;) I wish I had USB 3.1 though. I still have USB 3.0, so I guess it's not so bad over here. There are new 990FX chipsets with USB 3.1.

I saw the 970 Gaming from Gigabyte as well ;) with USB 3.1, for 100 euro ;).
 
Really? The i7-860 launched at a $284 MSRP. The price of that CPU tier has slowly crept up over all those years without adding any more cores.


The process is more expensive now, so price increases in the same bracket could just be covering that. The increase is only around 10-20%, and you have to factor in inflation as well.
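As a rough illustration of the inflation point (the ~2% annual rate is an assumption for the sake of the arithmetic, not a quoted figure):

```python
# Rough illustration: the i7-860's 2009 MSRP expressed in 2015 dollars,
# assuming a flat ~2% annual inflation rate (an assumption, not a quoted figure).
msrp_2009 = 284.0
annual_inflation = 0.02
years = 2015 - 2009

adjusted = msrp_2009 * (1 + annual_inflation) ** years
print(f"${msrp_2009:.0f} in 2009 is roughly ${adjusted:.0f} in 2015 dollars")
# -> roughly $320, so Skylake at $350 is only about a 10% increase in real terms
```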
 
The process is more expensive now, so price increases in the same bracket could just be covering that. The increase is only around 10-20%, and you have to factor in inflation as well.

The price would definitely be lower, and K SKUs wouldn't be a thing, if AMD hadn't fucked up so badly. Intel right now is like Apple had there been no Android.
I would love to buy Zen if it holds up to the rumors, but 2016 is too late to upgrade.
 
Maybe, but not by that much. There can be a counterfactual monopoly, which Intel kind of is right now. But you have sunk costs, which will gradually rise as new processes get more expensive (which they are), and of course inflation.

The CPU market is a duopoly between AMD and Intel, but right now AMD's products are weaker. So "competition is suffering but prices aren't coming down" is what some of us are saying, but that really isn't the case, because of market equilibrium.

The CPU industry is a mature market, and mature markets reach a price equilibrium because innovation has slowed down, which is exactly what Intel is doing, since they really don't need to innovate.

Given people's upgrading tendencies and their reasons to upgrade or buy new systems, even without competition Intel won't be able to sell chips if they start pricing too high. Nor would they go much lower if there were competition; you can probably attribute maybe 10% to competition at most, because there are sunk costs (costs that can't be recovered, like R&D, manufacturing, etc.).

Think of it this way: at the top end there is no competition for Intel; in the mid range, same thing. AMD is only competitive, performance-wise, at the low end.

What Intel is doing is increasing their margins at the low end by not innovating those chips. They may increase margins just a fraction in the mid range, but on high-end consumer processors they are maintaining the same margins they have had before (this is seen in the economics of mature markets led by one company, and has been seen many times). By doing so, overall margins increase, because low-end chips carried much thinner margins when AMD was competitive, and those are volume items, much more so than the mid range and high end.
 
You get what you pay for, I guess. Buy Intel and be stuck in a non-competitive environment.
 
You get what you pay for, I guess. Buy Intel and be stuck in a non-competitive environment.

Right now it's about as bad as it can get for competition, but there will be a handful of people who will buy the inferior product just to support competition.
Most people don't have the extra money/heart for this.
 
Well, it's not a problem when you don't need top-end performance--it's not like you need to upgrade your CPU every year anyway. And AMD's top-end stuff isn't that expensive; it just doesn't quite match up with Intel's top-end, especially if you ignore price.
 
You get what you pay for, I guess. Buy Intel and be stuck in a non-competitive environment.

Come on dude, I love AMD and even I know the only upgrade path I had was Intel. (I went from an FX 8350 to an i7 6700K, but I could have gone with a 5820K had I so desired.) The FX 9590 would have been fun had I purchased an Asus Crosshair V Formula-Z, but investing in old tech would not have made sense no matter how much fun I would have had.

Now I am on my new platform at home for at least 2 to 4 years. The video card is the only real thing I will want to upgrade in that time frame, though.
 
Right now it's about as bad as it can get for competition, but there will be a handful of people who will buy the inferior product just to support competition.
Most people don't have the extra money/heart for this.

There is nothing inferior about the FX 8-core series of processors. When you are talking 4K gaming, you may not see a difference in certain games if you can get a good overclock. Also, I use an FX 8350 with 32 GB of RAM at work, and although I would like to have a 5820K there, it would ultimately be a waste of money at this time, since I would not see any real difference between the two.

Now, I upgraded to a 6700K at home because I wanted something with new tech on the board to last me a few years. However, as I said above, the Crosshair mainboard and FX 9590 would have been fun to use, and fast too. It just would not have lasted me the length of time I would have wanted out of it.
 
The process is more expensive now, so price increases in the same bracket could just be covering that. The increase is only around 10-20%, and you have to factor in inflation as well.

And the die is how much smaller? How many more chips are they fitting on a wafer? ;)
 
Chips on a wafer =/= yields and PPU.

Yes, because Intel has never had to worry about yields when launching a new node. Come on man, let's be real here.

Why don't Nvidia and AMD have these issues? Those fuckers can squeeze out a 50% performance boost on the same node.

I don't give a shit about process technology if it doesn't bring about better performance. Quite frankly neither would Intel if there weren't money in doing so.
 
And the die is how much smaller? How many more chips are they fitting on a wafer? ;)

Quad-core Sandy Bridge chips on 32 nm were 216 mm²; 14 nm quad-core Skylake chips are 123 mm².

Die size didn't change that much ;)

Now, as the node gets smaller, the cost per wafer has gone up. So the inherent cost isn't directly proportional to die size across different nodes.
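To put rough numbers on that, here's a sketch of the chips-per-wafer math. The per-wafer costs below are invented placeholders purely to show the mechanics; real per-wafer pricing isn't public. The die sizes are the ones quoted above.

```python
import math

WAFER_DIAMETER_MM = 300

def dies_per_wafer(die_area_mm2):
    """Classic dies-per-wafer approximation (ignores defects and scribe lines)."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Die sizes from the post above; wafer costs are made-up placeholders.
for node, die_mm2, wafer_cost in [("32 nm", 216, 3000), ("14 nm", 123, 6000)]:
    n = dies_per_wafer(die_mm2)
    print(f"{node}: {n} dies/wafer, ${wafer_cost / n:.2f} per die "
          f"(assuming ${wafer_cost}/wafer)")
```

With those made-up numbers, the shrink nearly doubles the dies per wafer (about 281 vs 514), yet the cost per die actually creeps up slightly, which is the point: chips per wafer alone doesn't settle the cost question.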
 
Yes, because Intel has never had to worry about yields when launching a new node. Come on man, let's be real here.

I was just playing a little devil's advocate on your statement, that's all ;)
 
Thanks. I think you went a little overboard, but at least I can read your post on my laptop :)
 
Quad-core Sandy Bridge chips on 32 nm were 216 mm²; 14 nm quad-core Skylake chips are 123 mm².

Die size didn't change that much ;)

So, you're backing up what I'm saying?

Now, as the node gets smaller, the cost per wafer has gone up. So the inherent cost isn't directly proportional to die size across different nodes.

Source?
 
It's common sense. New process, lower yields on an immature process. Lower yields = increased cost per chip = increased effective cost per wafer.
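A minimal sketch of that yield effect, using the common Poisson yield approximation Y = exp(−D·A); the defect densities below are illustrative guesses, not real fab data, and the die size and wafer cost carry over from the earlier sketch:

```python
import math

# Poisson yield model: fraction of good dies Y = exp(-defect_density * die_area).
die_area_cm2 = 1.23      # ~123 mm^2, the Skylake die size quoted earlier
dies_per_wafer = 514     # from the dies-per-wafer sketch above
wafer_cost = 6000        # same made-up placeholder wafer cost as above

for label, defects_per_cm2 in [("mature node", 0.1), ("immature node", 0.5)]:
    good_fraction = math.exp(-defects_per_cm2 * die_area_cm2)
    good_dies = dies_per_wafer * good_fraction
    print(f"{label}: yield {good_fraction:.0%}, "
          f"cost per good die ${wafer_cost / good_dies:.2f}")
```

Same wafer, same die: dropping yield from roughly 88% to 54% pushes the cost per good die up by more than half, which is exactly the "immature process" penalty described above.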

Yup, just like every new tick. It's also common sense that when the die just keeps getting smaller and smaller, they're making money.

The lack of performance improvement at a given price point over the last half-decade, compared to markets with actual competition, tells me everything I need to know.
 
How anyone can defend Intel's lack of performance and the pricing of their "mainstream" chips at this point, I'll never understand.

It's due to lack of competition. Plain and simple.
 
When you have a narrative, why look for evidence to the contrary?

Pro tip: rarely are things plain and simple. Obviously facile things are much easier to discuss.

Node shrinks are obviously to increase and/or maintain viability in the market. Ultimately, yes, cost per transistor is dropping, but slowly, and at a very very substantial capital investment. That does mean that cost per wafer is increasing (transistor density). Semiconductors are a mature industry, by and large, and there are forces that are going to (ostensibly) stagnate "innovation" that are entirely unrelated to a lack of competition. Being hard-up against physics might be one of those.

It amazes me how much people take for granted how far we've come in terms of innovation, and how hard won much of that has been. It's not an effortless process, and the crumb size gets smaller and smaller over time.
 
When you have a narrative, why look for evidence to the contrary?

You're wasting your time explaining things to people that just want to argue. :)
 
Yup, just like every new tick. It's also common sense that when the die just keeps getting smaller and smaller, they're making money.

The lack of performance improvement at a given price point over the last half-decade, compared to markets with actual competition, tells me everything I need to know.

New processes are cheaper per transistor, not per mm².

It costs more for R&D, tooling, and testing for new nodes.

What this is saying is that die size has gone up and down over the past 10 years, yet chip prices have stayed within roughly 10%-15%, even though die size has swung as much as 100% up and 10% down. The cost of the end product is being equalized, for the most part, by market supply and demand.

This only happens in mature markets, where corporations know very well how much they can charge for a product, even in a near-monopoly (which this kind of already is), while keeping it reasonable for consumers to purchase and maintaining their profit margins. By doing so, they are still pushing AMD out of the low end, which inflates their per-chip and overall profit margins. If they price too high, they start losing not only at the high end but also at the low end.

Competition helps to a degree, but the effect is actually small in a mature market: even if both companies in the duopoly were competitive in all segments, each would still have to maintain margins on successive chips and keep investors happy, so they really won't cut their own profit margins much; resellers might cut into theirs, though.
 
When you have a narrative, why look for evidence to the contrary?

Exactly, good post!
 
Cost per wafer has gone up significantly over the past 4 or 5 node drops, especially at the launch of a new node.

http://techsoda.com/no-20nm-graphics-amd-nvidia/

The savings on similar-die products on a new, smaller node are nonexistent, and new dies of a similar size are actually more expensive to make.

With wafer sizes remaining roughly the same for the past decade, individual chip costs have gone up.

And this isn't purely down to the node, either.

http://electroiq.com/petes-posts/2015/01/26/exponentially-rising-costs-will-bring-changes/

Chip design costs have increased too.
 

After a wall of text and images you get this conclusion:
Thus it’s absolutely clear and no longer secret by any measure that we’re bound to see AMD come out with APUs that feature integrated High Bandwidth Memory in addition to other integrated components via die stacking. It’s simply the natural progression that has slowly been taking place over the past several years, culminated in the first ever such product “Fiji” last year. But that’s only the first step of many to come. And we can’t be more excited to see where die stacking can take the industry!

How can things be a secret if AMD themselves post about the features of HBM? That AMD will do things beyond graphics cards is clear, but WCCFTECH's "grasping at straws" is just generating hits rather than coming with any kind of proof of an actual APU in production.

If you know Zen is due before the end of this year and Zen APUs are slated for 2017, this is rather premature.
 