AMD: Zen will offer 40% faster performance per clock than Carrizo

Oh come on now. Almost all of the server space targeted by ARM is on Linux (web servers, etc.) and Linux / LAMP have been running on ARM for ages.

It's ultimately going to come down to who has the best product at the best price. Software compatibility isn't an issue.

Yeah, but then hardware.

Who's going to write all the drivers? Building architecture-agnostic drivers under Linux has not worked in the past, so I imagine almost everything we already have is x86 and PowerPC at most.

Drivers don't matter as much on a tablet that connects to nothing else, but in the server world you are expected to work with expansion hardware.
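To illustrate what bites a port in practice, here is a minimal, hypothetical sketch (not taken from any real driver): legacy x86 port I/O simply doesn't exist on ARM, while memory-mapped I/O is an ordinary load/store and carries over.

[code]
/* Hypothetical sketch: why some drivers end up x86-only.
 * Legacy ISA/PCI devices are poked with x86 port I/O (inb/outb),
 * instructions ARM does not have. Memory-mapped I/O, by contrast,
 * is a plain load/store and works on any architecture. */
#include <stdint.h>
#include <stdio.h>

#if defined(__i386__) || defined(__x86_64__)
#include <sys/io.h>                    /* x86-only port I/O wrappers */

static uint8_t read_status(uint16_t port)
{
    return inb(port);                  /* executes the x86 `in` instruction */
}
#else
/* On ARM there is no port I/O; the same device would have to be
 * exposed through a memory-mapped register window instead. */
static uint8_t read_status(volatile uint8_t *mmio_reg)
{
    return *mmio_reg;                  /* plain load: portable MMIO access */
}
#endif

int main(void)
{
#if defined(__i386__) || defined(__x86_64__)
    puts("built with the x86 port I/O path");
#else
    puts("built with the portable MMIO path");
#endif
    (void)read_status;                 /* not called: real access needs a device */
    return 0;
}
[/code]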

Who's going to provide a second-source for your processor, to keep costs down, and to cover your ass if AMD goes bust? That will certainly add to your acquisition costs.
 
Yeah, but then hardware.

Who's going to write all the drivers? Building architecture-agnostic drivers under Linux has not worked in the past, so I imagine almost everything we already have is x86 and PowerPC at most.

Drivers don't matter as much on a tablet that connects to nothing else, but in the server world you are expected to work with expansion hardware.

Who's going to provide a second-source for your processor, to keep costs down, and to cover your ass if AMD goes bust? That will certainly add to your acquisition costs.

What drivers used in the server space don't work on ARM?
 
I hope they manage it. I really want AMD to be competitive. For me, power consumption isn't a huge deal. If Intel has an i7 at 95 W and AMD can give the same performance and clock speed at 125 W while being cheaper, that would probably move me back to AMD. I'm still skeptical they can do it.
 
I hope they manage it. I really want AMD to be competitive. For me, power consumption isn't a huge deal. If Intel has an i7 at 95 W and AMD can give the same performance and clock speed at 125 W while being cheaper, that would probably move me back to AMD. I'm still skeptical they can do it.

IMO power consumption is an important factor. Not because of how much power it consumes, but because you have to dissipate almost all of that power in the form of heat. When you have too much power dissipation, you'll quickly hit a wall in terms of performance scaling with clock speed.

Intel learned from that mistake with Netburst. Prescott was a disaster due to its thermal dissipation issues, and it proved that such an approach leads to a dead end. Therefore, IMO, for Zen to be competitive, AMD needs to ensure at least a decent balance of efficiency. That way they can do more with the architecture and serve different market segments, including mobile and enterprise, where power consumption and thermal dissipation are very important.
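For reference, the standard first-order model behind that wall is

\[ P_{\text{dynamic}} \approx \alpha \, C \, V^2 \, f \]

(switching activity times capacitance times voltage squared times frequency). Since sustaining a higher f generally requires a higher V, power grows much faster than linearly with clock speed, which is roughly why Prescott-style frequency chasing hit a thermal ceiling.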
 
Zarathustra[H];1041874144 said:
If AMD can do the following, I will buy one.
- Single-threaded performance on par with, or better than, my 2011 i7-3930k at its overclocked speed.
- At least 6 real cores. (I don't care about logical cores. Take them or leave them)
- No less than 40 PCIe lanes. (preferably more)

I want to support AMD, and I will buy this product, even if it has marginal or no performance gains. I will - however - not buy anything in 2016 or 2017 (whenever it comes out) that performs worse than what I bought in 2011.

This.

I try to keep 3 gaming rigs running. I =want= AMD to be in at least one of them. One AMD gpu in the mix, and one AMD cpu in the mix. It's kind of hard to do that these days.

C'mon, AMD.

(In ~3-5 months, the 1090T rig is going away. It'll be replaced by an i5. I'm sure the performance gains will be spectacular, as will the drop in heat output. I've held out...too long...on that rig.)
 
IMO power consumption is an important factor. Not because of how much power it consumes, but because you have to dissipate almost all of that power in the form of heat. When you have too much power dissipation, you'll quickly hit a wall in terms of performance scaling with clock speed.

Intel learned from that mistake with Netburst. Prescott was a disaster due to its thermal dissipation issues, and it proved that such an approach leads to a dead end. Therefore, IMO, for Zen to be competitive, AMD needs to ensure at least a decent balance of efficiency. That way they can do more with the architecture and serve different market segments, including mobile and enterprise, where power consumption and thermal dissipation are very important.

My bold added, above.

This hits the nail on the head. I have two rigs in the same room: an i7 with 5 drives, RAM, a GTX 970, etc., and the 1090T with an HD 6870 GPU and fewer of everything. The i7 stays on 24/7 (sleep mode as needed). The room is fine, temperature-wise, even with some long gaming sessions. As soon as the 1090T is on, even without gaming, the heat increase in the room seems significant. Any gaming on it, and the room gets very warm.

I don't care about the electrical bill. Yeah, I pay it, so it's my choice. I do care about making the room uncomfortably warm every time the rig is running. The thermostat is at the other end of the floor, so cranking it down to cool that room makes the rest of the floor a bit too chilly for comfort. It reminds me of my sister's Easy-Bake Oven when we were kids. ;) (Yeah, a bunch of Noctua fans, etc., etc. The room is hot because I've got the case cooling the electronics off pretty well. Big heat dump.)
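Rough numbers back this up. Assuming the 1090T box pulls around 275 W under load (a guess: ~125 W CPU plus ~150 W GPU), a three-hour session dumps

\[ 275\,\mathrm{W} \times 3\,\mathrm{h} \approx 0.8\,\mathrm{kWh} \approx 3\,\mathrm{MJ} \]

into the room. The air in a sealed ~30 m^3 room is only about 36 kg, with a heat capacity near 1 kJ/(kg·K), so with zero losses that would be an ~80 K rise; even with walls and HVAC carrying most of it away, a few degrees of warming is exactly what you'd expect.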
 
IMO power consumption is an important factor. Not because of how much power it consumes, but because you have to dissipate almost all of that power in the form of heat. When you have too much power dissipation, you'll quickly hit a wall in terms of performance scaling with clock speed.

Intel learned from that mistake with Netburst. Prescott was a disaster due to its thermal dissipation issues, and it proved that such an approach leads to a dead end. Therefore, IMO, for Zen to be competitive, AMD needs to ensure at least a decent balance of efficiency. That way they can do more with the architecture and serve different market segments, including mobile and enterprise, where power consumption and thermal dissipation are very important.

In datacenters, actual power consumption is important, not just the heat you have to dissipate. For the home user, I'd generally agree.
 
This.

I try to keep 3 gaming rigs running. I =want= AMD to be in at least one of them. One AMD gpu in the mix, and one AMD cpu in the mix. It's kind of hard to do that these days.

C'mon, AMD.

(In ~3-5 months, the 1090T rig is going away. It'll be replaced by an i5. I'm sure the performance gains will be spectacular, as will the drop in heat output. I've held out...too long...on that rig.)

Similar situation - my 3-year-old 8350/990FX desktop is on its last legs. Since I use it for work and can't be without it, I dropped the money on an i7 6700k last night. I really didn't want to do that, considering I hadn't bought any Intel in 15 years, but I couldn't justify re-buying what I bought in 2012.
 
Similar situation - my 3-year-old 8350/990FX desktop is on its last legs. Since I use it for work and can't be without it, I dropped the money on an i7 6700k last night. I really didn't want to do that, considering I hadn't bought any Intel in 15 years, but I couldn't justify re-buying what I bought in 2012.

I would have kept the 8350 longer. What puts it on its last legs? I would have upgraded to Haswell-E, or Skylake-E (when they are released); you get much more out of that upgrade.
 
I think it is a combination motherboard-and-CPU problem. About a year ago I started having issues running 4x8 GB Corsair DDR3-1600 sticks. I dropped the memory speed down to 1066 and it was fine for a while, but eventually even that didn't keep it stable - so a few months ago I took out the 2nd pair of 8 GB sticks, and I had no issues for a while. And yes, I did verify that the two I pulled out work OK in one of my Kaveri systems.

Then a few weeks ago I started seeing my CPU temps hit 65°C+. Figuring maybe the thermal paste had gone a bit thin after almost 3 years, I swapped in a Corsair H100i GTX closed-loop water cooler, since I'd had great success with that in other builds. No dice: still high temps, and if I was compiling larger solutions in VS (which I do frequently), it would crash.

Like I said previously, not thrilled to drop money on Intel now only to drop more money when Zen comes out.
 
I recall hearing something about the Sabertooth VRMs getting really hot; if you have a spare fan, pointing it at them might be something you could try...

I'm not too sure about this topic, since Zen is still a year away from launch. What I did pick up from another website is that the architectural change from CMT to SMT alone should yield a significant boost in IPC; the other factor is that the manufacturing process is FinFET, which has different properties from the processes AMD previously used for its CPUs.

I'm not too worried about Zen in general. The chance that you spend all that money on getting Jim Keller aboard and then completely fail at something he is pretty good at seems rather small.
 
IMO power consumption is an important factor. Not because of how much power it consumes, but because you have to dissipate almost all of that power in the form of heat. When you have too much power dissipation, you'll quickly hit a wall in terms of performance scaling with clock speed.

Intel learned from that mistake with Netburst. Prescott was a disaster due to its thermal dissipation issues, and it proved that such an approach leads to a dead end. Therefore, IMO, for Zen to be competitive, AMD needs to ensure at least a decent balance of efficiency. That way they can do more with the architecture and serve different market segments, including mobile and enterprise, where power consumption and thermal dissipation are very important.

For me, 95 W vs. 125 W isn't a big deal if the performance is close. Granted, I would like the highest-performing CPUs to be 95 W or lower, but I can tolerate 125 W. But if they have to do something like the FX-9590 at 220 W, I won't be switching back.
 
Oh come on now. Almost all of the server space targeted by ARM is on Linux (web servers, etc.) and Linux / LAMP have been running on ARM for ages.

It's ultimately going to come down to who has the best product at the best price. Software compatibility isn't an issue.

Wrong. If software compatibility weren't an issue, then WinRT would have succeeded, and x86 as an architecture would have died off 20 years ago with the Itanium. Software matters, more so than the hardware.
 
Wrong. If software compatibility weren't an issue, then WinRT would have succeeded, and x86 as an architecture would have died off 20 years ago with the Itanium. Software matters, more so than the hardware.

This. Backwards compatibility is the precise reason x86 has been so successful.
 
Only if the cost of emulation were not so high in terms of performance loss. Transmeta thought they could design a chip to reduce the overhead, but in the end it did not work out well enough for them to be successful.
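The overhead is easy to see in the naive case. Below is just a toy interpreter for a made-up ISA (nothing to do with Transmeta's actual code-morphing): every guest instruction costs the host a fetch, a dispatch branch, and a handler body - easily 10x or more host work per guest op, which is the gap translation and caching try to close.

[code]
/* Toy interpreter for a made-up 2-operand ISA, to illustrate
 * emulation overhead: each guest opcode below costs the host a
 * load, a switch dispatch, and a handler body. */
#include <stdint.h>
#include <stdio.h>

enum { OP_HALT, OP_LOADI, OP_ADD, OP_PRINT };

int main(void)
{
    /* guest program: r0 = 2; r1 = 3; r0 += r1; print r0; halt */
    const uint8_t code[] = { OP_LOADI, 0, 2,
                             OP_LOADI, 1, 3,
                             OP_ADD,   0, 1,
                             OP_PRINT, 0,
                             OP_HALT };
    uint32_t reg[4] = {0};
    size_t pc = 0;

    for (;;) {
        uint8_t op = code[pc++];          /* fetch */
        switch (op) {                     /* decode/dispatch */
        case OP_LOADI: {
            uint8_t r = code[pc++];
            reg[r] = code[pc++];
            break;
        }
        case OP_ADD: {
            uint8_t a = code[pc++];
            uint8_t b = code[pc++];
            reg[a] += reg[b];
            break;
        }
        case OP_PRINT:
            printf("%u\n", reg[code[pc++]]);   /* prints 5 */
            break;
        case OP_HALT:
            return 0;
        default:
            return 1;                      /* malformed guest code */
        }
    }
}
[/code]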
 
It depends; you can see that Nintendo made some money using emulation on the Wii. And today it seems an acceptable way to make people pay again for old stuff on consoles ;).
 
... We're talking server architectures?

I'm sure there is enough homebrew to make your Wii a server ;).

I still can't wrap my head around Windows RT failing. I mean, they had Office, right? That's all the software you would ever need. Was someone in here saying that Windows RT was for servers?

Where is this discussion going? Someone tell me this is not about Windows RT?

:)
 
Zarathustra[H];1041883882 said:
The only reason Apple has been able to pull that off - twice - is because of their rabidly loyal fanbase.

That, and they handled the architecture transitions well. With OS X, Apple introduced a new programming model that relied on Apple's own libraries, so moving from one arch to the next was (mostly) a matter of porting those libraries, which made everything much easier.
https://en.wikipedia.org/wiki/Cocoa_(API)
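The other half of the trick was shipping both architectures in one file. Here's a rough sketch of the "universal binary" idea; the structs mirror the Mach-O fat header layout from <mach-o/fat.h> (values are big-endian on disk), though the reader itself is just an illustration.

[code]
/* Sketch of the Mach-O "fat"/universal binary header: one file
 * carries several per-architecture slices, and the loader picks
 * the one matching the CPU. Struct layout mirrors <mach-o/fat.h>. */
#include <stdint.h>
#include <stdio.h>

#define FAT_MAGIC 0xcafebabeU          /* big-endian on disk */

struct fat_header {
    uint32_t magic;                    /* FAT_MAGIC */
    uint32_t nfat_arch;                /* number of arch slices that follow */
};

struct fat_arch {
    uint32_t cputype;                  /* e.g. PowerPC, i386, x86_64 */
    uint32_t cpusubtype;
    uint32_t offset;                   /* file offset of this slice */
    uint32_t size;                     /* size of this slice */
    uint32_t align;                    /* alignment as a power of 2 */
};

/* Convert the on-disk big-endian value to host order
 * (assumes a little-endian host, e.g. x86). */
static uint32_t be32(uint32_t v)
{
    return (v >> 24) | ((v >> 8) & 0xff00) |
           ((v << 8) & 0xff0000) | (v << 24);
}

int main(int argc, char **argv)
{
    if (argc < 2) { fprintf(stderr, "usage: %s <binary>\n", argv[0]); return 1; }

    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    struct fat_header h;
    if (fread(&h, sizeof h, 1, f) == 1 && be32(h.magic) == FAT_MAGIC)
        printf("universal binary with %u arch slices\n",
               (unsigned)be32(h.nfat_arch));
    else
        printf("not a fat binary\n");

    fclose(f);
    return 0;
}
[/code]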
 
I'll believe it when I see it.
10/16 is a long ways away.
Especially when it's something they've (AMD) needed since 2011.
If it's good, I'll definitely get one.
Why is Jim Keller not on the project now, though? That seems like a red flag to me.
 
Wrong. If software compatibility weren't an issue, then WinRT would have succeeded, and x86 as an architecture would have died off 20 years ago with the Itanium. Software matters, more so than the hardware.

Let's stay in context here.

1) Windows does not have the kind of cross-platform compatibility that Linux does, in terms of its libraries, compiler toolchains, etc. That's why WinRT was a failure. It wasn't a matter of just recompiling a software package and tweaking out any quirks, like Canonical did with their, oh, entire Ubuntu repository. Besides, MS had a vested interest in pushing users toward x86; they never really seemed interested in ARM. And WinRT was a locked-down OS like iOS. That might fly in Apple land, but...

2) The context here is ARM servers. ARM servers. That means smaller servers for hosting web sites or whatever else you need a bunch of "slow" threads for. Most web hosting is on Linux, and all of that software has been running on ARM for like a million years.

So in the context of the discussion -- ARM servers -- software compatibility is hardly the problem. The issue is going to come down to cost and whether the sacrifice of single-threaded performance is acceptable.
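The "tweaking out quirks" part above usually amounts to hunting down x86-isms. A hypothetical example (not from any real package) of the kind of thing a recompile-for-ARM flushes out:

[code]
/* Hypothetical porting quirk: a package that timestamps events
 * with the x86 TSC won't build on ARM; the fix is a portable
 * fallback. This is the sort of tweak a recompile-for-ARM needs. */
#include <stdint.h>
#include <stdio.h>
#include <time.h>

static uint64_t timestamp(void)
{
#if defined(__i386__) || defined(__x86_64__)
    uint32_t lo, hi;
    __asm__ volatile ("rdtsc" : "=a"(lo), "=d"(hi));   /* x86-only */
    return ((uint64_t)hi << 32) | lo;
#else
    /* portable path: nanoseconds from the monotonic clock */
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000000000u + (uint64_t)ts.tv_nsec;
#endif
}

int main(void)
{
    printf("tick: %llu\n", (unsigned long long)timestamp());
    return 0;
}
[/code]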
 
Zarathustra[H];1041883882 said:
The only reason Apple has been able to pull that off - twice - is because of their rabidly loyal fanbase.


That, and most Apple users don't know shit about their computers, either hardware or software.
 
Why is Jim Keller not on the project now, though? That seems like a red flag to me.

Keller has historically left companies after finishing his work on the designs. He has finished his work on Zen and Zen+ and doesn't have much else to do, so he left. According to people in the company, his departure was known to be coming well before it happened, which is normal. He is done with the bits he's good at (high-level design and overall planning; he also had some duties such as organizing their roadmaps and helping with their execution), so the rest of his team will handle the rest. Zen taped out months ago, and they're gonna spend the rest of the period before volume production verifying, sampling, testing and all that jazz as usual. Nothing worth waving a red flag over; this is par for the course.
 
I'll believe it when I see it.
10/16 is a long ways away.
Especially when it's something they've (AMD) needed since 2011.
If it's good, I'll definitely get one.
Why is Jim Keller not on the project now, though? That seems like a red flag to me.

The real question should be "Where is the news on K12?". I think it's more than likely that project was canned due to cost reasons, which in turn forced Keller to leave AMD.

This also implies Zen is pretty much it, product wise.

This is weird; that is why you get replies like this:

Is PCSX2's code that shitty?

It's pretty good, but some titles are VERY CPU intensive.
 
This also implies Zen is pretty much it, product wise.

But is that bad? With AMD having ignored AM3+ for several years and their business going in a different direction without seeing much cash flow, did you expect they would keep pouring money into a segment where they are struggling to make any money?

In the end it would not be surprising if AMD puts their eggs in different baskets rather than going all in on one. They're hoping ARM will make them some money, and making stuff with long-term cash flow, like the consoles, was a step in the right direction (they don't have to compete with anyone, nor fear the competition every OEM cycle).
 
If Zen can't make PCSX2 run Zone of the Enders smoothly, then AMD is dead to me, CPU-wise.

Why would you spend lots of money on a gaming rig with a fast processor, if your priority is to run lame old PS2 games on it?

Just get the real thing off of craigslist for less than $40 instead and call it a day :p
 
Zarathustra[H];1041886073 said:
Why would you spend lots of money on a gaming rig with a fast processor, if your priority is to run lame old PS2 games on it?

Just get the real thing off of craigslist for less than $40 instead and call it a day :p

QFT. Emulation can be woefully inaccurate to boot. There is no substitute for original hardware.
 
Which is why I have 386, 486, and Pentium setups with original hardware (sound cards, etc.) for my vintage gaming needs. ;-)

:D

Well, for things within the PC family, I find DosBox does a really good job though.

8 bit era console emulators are also pretty good.

Anything later than that though, and IMHO, the emulators start to become problematic.

The 16-bit era emulators all try to get fancy and upscale the resolution in ways that just look dumb.

Emulators after the 16-bit era I have mostly found to be crippled by bugs or bad code, or to just not run particularly well.

I tried PCSX2 once because I had a late-night craving for some Katamari Damacy, but I found it to be a buggy horror show with terrible design/usability.
 
Running games at 4x-6x the internal resolution makes them look amazing in PCSX2; it's like having the HD collection of the game. Also, I do have a PS2, but I don't really feel like dedicating space to it on my cramped desk.

Either way, if it fails to run one of the hardest games smoothly in PCSX2, I can always upgrade to Zen, then trade it back to my brother for my old G3258 build.
 
LOL! Thanks for the reminders. DOS and SoundBlaster; IRQ conflicts; massive 200 MB hard drives; PATA cables; and paying extra for the "math co-processor" so "X-Wing" would fly better! Ahhh, the good ol' days...
 
The real question should be "Where is the news on K12?". I think it's more than likely that project was canned due to cost reasons, which in turn forced Keller to leave AMD.

This also implies Zen is pretty much it, product wise.

AMD already said that K12 was merely pushed back in favor of x86, since they realized customers weren't as keen on ARM as they'd thought. K12 is still a thing, but they opted to throw more R&D at finalizing Zen rather than K12. I don't see how this would "force" Keller to leave AMD, since even his part in the design of K12 was finished well before he left.

Is PCSX2's code that shitty?

PCSX2's main issue is with the GSdx plugin, the one responsible for basically emulating the GS (Graphics Synthesizer) chip of the PS2, and the EE as well. It was written by a guy who is very skilled at what he does, but his coding style is almost alien to everyone else, since he didn't really document it much. Add to that the fact that he's nowhere near as active as he used to be, and thus only appears once in a blue moon to fix stuff or add on to it, and that's why the few contributors we currently have on the project can't do much about it. One guy did make a lot of progress on the OpenGL version of the plugin, though, and as such the OGL version is now much faster and more accurate than the DX version.

Basically we could get huge speed-ups if the plugin were rewritten, but that would be a lot of work. I think the guy behind GSdx hinted that he might come back and do a DX12 or Vulkan version (Vulkan would make more sense) if he were to rewrite it, so that'd be nice. Some games are very CPU intensive, for various reasons, but most games on the system are totally playable, and they don't all require an overclocked i5 to run at full speed.
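For the curious, here's the general shape of that core/plugin split as a hypothetical C interface (names invented; this is not the actual PCSX2 plugin spec). The core emulates the CPU side and feeds graphics packets to whatever GS plugin is loaded through a small function table, which is why renderers (DX, OGL, software) can be swapped or rewritten without touching the core.

[code]
/* Hypothetical sketch of an emulator core/GS-plugin boundary
 * (names invented, not the real PCSX2 API). The core only sees
 * this table, so the GS renderer can be swapped out or rewritten
 * without touching the core. */
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

struct gs_plugin {
    int  (*open)(void);                                     /* create window/device */
    void (*gif_transfer)(const uint8_t *data, size_t len);  /* graphics packets in */
    void (*vsync)(void);                                    /* present a frame */
    void (*close)(void);
};

/* A trivial "null renderer" implementing the interface. */
static int  null_open(void)  { puts("gs: open");  return 0; }
static void null_gif(const uint8_t *d, size_t n) { (void)d; printf("gs: %zu bytes\n", n); }
static void null_vsync(void) { puts("gs: vsync"); }
static void null_close(void) { puts("gs: close"); }

static const struct gs_plugin null_gs = { null_open, null_gif, null_vsync, null_close };

int main(void)
{
    const struct gs_plugin *gs = &null_gs;   /* core picks a plugin at runtime */
    uint8_t packet[16] = {0};

    if (gs->open() == 0) {
        gs->gif_transfer(packet, sizeof packet);  /* CPU-side code feeds the GS */
        gs->vsync();
        gs->close();
    }
    return 0;
}
[/code]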

Why would you spend lots of money on a gaming rig with a fast processor, if your priority is to run lame old PS2 games on it?

The PS2 had a lot of fuckin' awesome games that never came to PC or any other platform, and it's nice to be able to play them natively upscaled, in some cases with smoother framerates/performance and faster loading. A lot of the emulation is accurate today even under the hardware renderer, and if accuracy is what you want, there's always the software renderer, which has a very high accuracy rate these days.

Of course the PS2 is still the best way to play if you don't wanna deal with all the nonsense of system requirements and revisions of the emu and plugins and all that jazz.
 
QFT. Emulation can be woefully inaccurate to boot. There is no substitute for original hardware.

That's true, but there are advantages to emulators too.

I still have my PS2, but when I hooked it up to my LCD TV, it looked really bad. PS1 games are even worse. LCD screens just don't handle those low resolutions well.

On an emulator, the graphics can be improved in ways that enhance the experience (at least for me). For example, on the PS1 emulator I'm using a "2xSaI Smart" filter, which applies a certain blur effect to smooth everything out, including the low-res pre-rendered backgrounds. IMO, it's the closest we can get to playing PS1 games on a CRT TV without having a CRT TV.
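This isn't the actual 2xSaI algorithm, but here's a toy illustration of the idea: instead of fat nearest-neighbour pixel blocks, the inserted pixels are blends of their neighbours, which is where that CRT-ish softness comes from.

[code]
/* Toy 2x upscale with blended in-between pixels -- a stand-in for
 * what smoothing filters like 2xSaI do, not the real algorithm.
 * Grayscale for simplicity. */
#include <stdint.h>
#include <stdio.h>

#define W 4
#define H 4

static uint8_t avg(uint8_t a, uint8_t b) { return (uint8_t)(((int)a + b) / 2); }

static void upscale2x(const uint8_t src[H][W], uint8_t dst[2*H][2*W])
{
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            uint8_t p     = src[y][x];
            uint8_t right = src[y][x + 1 < W ? x + 1 : x];  /* clamp at edge */
            uint8_t down  = src[y + 1 < H ? y + 1 : y][x];

            dst[2*y][2*x]         = p;                       /* original pixel  */
            dst[2*y][2*x + 1]     = avg(p, right);           /* blend rightward */
            dst[2*y + 1][2*x]     = avg(p, down);            /* blend downward  */
            dst[2*y + 1][2*x + 1] = avg(avg(p, right), avg(p, down)); /* diagonal */
        }
    }
}

int main(void)
{
    const uint8_t src[H][W] = {
        {   0, 255,   0, 255 },
        { 255,   0, 255,   0 },
        {   0, 255,   0, 255 },
        { 255,   0, 255,   0 },
    };
    uint8_t dst[2*H][2*W];

    upscale2x(src, dst);
    for (int y = 0; y < 2*H; y++) {
        for (int x = 0; x < 2*W; x++)
            printf("%4u", dst[y][x]);
        putchar('\n');
    }
    return 0;
}
[/code]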

Even on the PS3, there's an option to apply a smoothing filter for PS1 games, and it does a great job IMO. Though I'm not going to repurchase my entire PS1 library from PSN, so I'll settle for a PC emulator.
 
To those who are concerned that AMD has nothing new between now and 2016, I guess my take is as follows.

AMD is SIGNIFICANTLY behind Intel today.

They make their CPU sales from people who are not that interested in high end performance, or who shop out of brand loyalty.

They could spend money and engineering resources on yet another incremental revision on the flawed bulldozer architecture and get what, another 5% boost in performance? They'd still likely be stuck on the same process node.

In the end it comes down to this: is being significantly behind Intel really that much worse than being significantly behind Intel plus 5%?

They will still likely make some dribs and drabs in sales from the same folks who are currently buying, and the truth is that 5% likely wouldn't make much of a difference.

As long as they have the cash on hand to survive until a Zen launch, it makes a lot of sense to cut their losses on the current lines and go full steam ahead, with all available resources, on what promises to be a major leap - in large part due to the architecture redesign, but also because the smaller process nodes will be available by then.

I think this is the right decision to make. Spending more money for yet another incremental development of a dead platform only to still be significantly behind their main competitor right now would be a very silly thing to do.

There is some risk here though. Zen ABSOLUTELY MUST live up to expectations, or they might as well liquidate and go home.

They do not have enough cash on hand (or enough assets to sell off) to survive yet another bulldozer...
 
Zarathustra[H];1041886508 said:
To those who are concerned that AMD has nothing new between now and 2016, I guess my take is as follows.

AMD is SIGNIFICANTLY behind Intel today.

They make their CPU sales from people who are not that interested in high end performance, or who shop out of brand loyalty.

They could spend money and engineering resources on yet another incremental revision on the flawed bulldozer architecture and get what, another 5% boost in performance? They'd still likely be stuck on the same process node.

In the end it comes down to this: is being significantly behind Intel really that much worse than being significantly behind Intel plus 5%?

They will still likely make some dribs and drabs in sales from the same folks who are currently buying, and the truth is that 5% likely wouldn't make much of a difference.

As long as they have the cash on hand to survive until a Zen launch, it makes a lot of sense to cut their losses on the current lines and go full steam ahead, with all available resources, on what promises to be a major leap - in large part due to the architecture redesign, but also because the smaller process nodes will be available by then.

I think this is the right decision to make. Spending more money for yet another incremental development of a dead platform only to still be significantly behind their main competitor right now would be a very silly thing to do.

There is some risk here though. Zen ABSOLUTELY MUST live up to expectations, or they might as well liquidate and go home.

They do not have enough cash on hand (or enough assets to sell off) to survive yet another bulldozer...

If they were to do that now, then yes, it would be a waste of money. However, 2 to 3 years ago, a new, more advanced chipset and a somewhat better CPU based on the Steamroller cores would have been very welcome. In fact, I remember those days, because they were so silent about letting us know what was going to happen.

Personally, I think they lost a lot of loyal customers and a lot of their following when they remained silent back then. I am not sure they will ever have as loyal a following again, and I own an FX-8320 and an FX-8350.
 