Cascade Lake-X 10980XE, 5.1GHz boost on all cores

To the 'But AMD was the victim' argument:

Note that Intel made AMD what they are by providing them with business and the necessary licenses.

It's in the past, it's done, and AMD has done quite a bit since. However, it also makes sense that Intel would want to end that relationship when it was no longer needed, i.e., once they were no longer bound by IBM to ensure a second source.
 
Holy Cow this thread... Please lock it now.
My eyes are bleeding from all the back and forth of this pissing contest.
 
It's in the past, it's done, and AMD has done quite a bit since
No, we are not going to forget Bulldozer, damn it; that CMT architecture will not be forgotten so easily, nor will this thread be locked!
I do get the feeling AMD will never be forgiven for Bulldozer, the same way Intel will never be forgiven for Netburst. :D
 
No, we are not going to forget Bulldozer, damn it; that CMT architecture will not be forgotten so easily, nor will this thread be locked!
I do get the feeling AMD will never be forgiven for Bulldozer, the same way Intel will never be forgiven for Netburst. :D

Both are arguably why marketing teams must never be allowed to drive designs.
 
I would say they are both at fault. There are degrees, and Intel was found to be "more guilty", and they paid for it. AMD was better at being slimy while not doing anything illegal (that we know of). I don't mind disagreements with this characterization of AMD's behavior... I was not there (working at AMD or Intel). But then neither were you.

All the details of the (history of the) agreements and reasons for them, I agree. I just did not feel like digging up all of those details.

The requirement for a second supplier was obviously obsolete once many companies besides IBM started making PCs. When the requirement was no longer there, Intel stopped sharing the CPU "blueprint" with AMD. AMD, not wanting to be left out, started cloning Intel chips, and only with the K5 did they make their own design.

What I am not sure of, is whether having a cross-licensing agreement means that one would HAVE to let the other have access to ALL of their tech.

At some point Intel did not want to keep giving away intellectual property. They got AMD into the desktop processor market (granted at IBM's requirement). We can make an educated guess that Intel was tired of a competitor reaping benefits that they developed. Then Intel went overboard and into illegal territory, and got dinged for it.

As I said before, I think AMD has done good things for the processor market. I'm not going to "buy" or "not buy" products from either, based on past mistakes they or the other have made. Legal mistakes, bad behavior, design mistakes, whatever. (There's plenty of that to spread around).

Again... how does using the arbitration clause in your contract, a document which BOTH companies agreed to, make you slimy?
 
To the 'But AMD was the victim' argument:

Note that Intel made AMD what they are by providing them with business and the necessary licenses.

It's in the past, it's done, and AMD has done quite a bit since. However, it also makes sense that Intel would want to end that relationship when it was no longer needed, i.e., once they were no longer bound by IBM to ensure a second source.

AMD was doing fine without a single agreement with Intel. They were a smaller company, but they were profitable throughout the '70s, and once the "chip dumping" by Japanese companies was reduced to acceptable levels in the late '70s and early '80s, they (and many other companies) were even more profitable.

The reason Intel revoked the cross-license agreement was that AMD was becoming more profitable than Intel. At the time, in the early '80s, Intel's fastest CPU was the 12.5MHz 80286, while AMD was pushing out 16MHz, 20MHz, and 25MHz 80286s that sold as fast as they were made. AMD was eating Intel's lunch, making chips twice as fast. That's why Intel revoked the cross-license agreement.
 
The cross-license was never terminated. Intel failed to abide by it and produce the microcode as outlined in the agreement. AMD had to sue for it, and over other underhanded Intel practices. The cross-license exists today, in modified form. It is regularly reviewed, modified, and renewed.

“AMD had brought suit against Intel in 1987 for breach of contract. Intel was bound by a 1982 cross license to give AMD microcode for its x86 processors.”

ref:
http://www.cpushack.com/2012/09/06/intel-vs-the-world-the-338-patent/
 
Something else... don't forget no one gives a shit about x86 anymore, since the world has chosen AMD64 as the de facto standard.
 
The cross-license was never terminated. Intel failed to abide by it and produce the microcode as outlined in the agreement. AMD had to sue for it, and over other underhanded Intel practices. The cross-license exists today, in modified form. It is regularly reviewed, modified, and renewed.

“AMD had brought suit against Intel in 1987 for breach of contract. Intel was bound by a 1982 cross license to give AMD microcode for its x86 processors.”

ref:
http://www.cpushack.com/2012/09/06/intel-vs-the-world-the-338-patent/

According to the testimony of Hector Ruiz, "Intel reacted by cancelling the 1982 technological-exchange agreement altogether"
 
According to the testimony of Hector Ruiz, "Intel reacted by cancelling the 1982 technological-exchange agreement altogether"
OK, fine, if that's the case. Anyway, AMD sued and won, and the cross-licensing was reinstated. That should tell you that x86/x64 development is so incestuous that both Intel and AMD legally need it to survive.
 
He did buy ATI after all when he was head of AMD.

Putting aside the Intel shenanigans that compounded the problems in that era, Ruiz did drive AMD into the ground. Just ask Jim Keller what AMD was like back then, with them firing people left and right (he references it in his "Moore's Law is not dead" talk at Berkeley). And most of all, the divisions they sold off to stay afloat. Adreno, anyone? Oh, and on the ATI purchase: he/they applied GPU design processes to CPUs, which led to the biggest turd in company history...
 
Putting aside the Intel shenanigans that compounded the problems in that era, Ruiz did drive AMD into the ground. Just ask Jim Keller what AMD was like back then, with them firing people left and right (he references it in his "Moore's Law is not dead" talk at Berkeley). And most of all, the divisions they sold off to stay afloat. Adreno, anyone? Oh, and on the ATI purchase: he/they applied GPU design processes to CPUs, which led to the biggest turd in company history...

Hehe I agree it sucked under his watch. Just wanted to mention the one big positive.
 
Something else... don't forget no one gives a shit about x86 anymore, since the world has chosen AMD64 as the de facto standard.
x64 is an AMD modification that lets x86 cores address 64-bit memory addresses. The x86 core lives on as a subset of x64.

AMD64 is x86-64 in general, which is IA-32 with the registers extended for 64-bit use. It also wasn't the least bit relevant at the time for desktop use, because the only 64-bit Windows kernel available was server-based (XP x64 was a respin of it). It wasn't until much later, with 64-bit Vista, that consumers started using the extension, and it wasn't really until then that memory was affordable, let alone useful, for common consumer applications, which also had to be coded for 64-bit.

Also note that coding in 64-bit doesn't have much utility if an application does not need access to the extra memory or precision offered; instead, it only serves to unnecessarily bloat the executable. We still see many applications run in 32-bit today, too, on all operating systems.

In the end, as mentioned before, AMD64 itself wasn't an innovation; it was a successful marketing coup. There was not much effort involved on the technical side, and AMD's real win was support from Microsoft, first in the release of Server 2003 x64, then XP x64, and finally the mainstream Vista x64 releases.
 
10 cores and also 44 or whatever PCIe lanes, don't forget.

I get a kick out of Minecraft streamers running systems built on stuff like this, but it's not really what they're for. Whatever the equivalent of the 9900K turns out to be, it will be cheaper.

I seem to remember there being demos of the AMD chips straight-up punishing the Intel chips when it came to streaming (+ gaming).
 
x64 is an AMD modification that lets x86 cores address 64-bit memory addresses. The x86 core lives on as a subset of x64.

That's incorrect. It's not a subset or a modification. Google it. Intel tried to push IA-64, which was not compatible (think $$$) with x86, and it failed miserably. AMD saw how stupid that was and created AMD64, i.e., a 64-bit instruction set for x86. Guess who needs a license to use AMD64, aka x64? Basically, the world runs on x64, or AMD64...


Is this where we introduce the three apps that use AVX-512 now?

Hehe. I don't think I would use AVX-512, as it just about doubles power draw; it's insane.
 
Also note that coding in 64-bit doesn't have much utility if an application does not need access to the extra memory or precision offered; instead, it only serves to unnecessarily bloat the executable.

Except when it can process 64 bits in one clock cycle vs. 32...
 
Thanks for proving my point. x86 is a subset of x64. The x64 part is the memory addressing.

Dude, don't argue semantics. Maybe you didn't get the point that you don't need the x86 ISA; did you miss that part, since that's EXACTLY what Intel tried to do? AMD created AMD64, which is an ISA unto itself, i.e., the x64 ISA. The difference is that AMD made it backward compatible. AMD doesn't work for Intel, so them making something for Intel's x86 ISA... that's the way I'm reading it, so correct me if that's not what you meant.
 
The difference is that AMD made it backward compatible. AMD doesn't work for Intel, so them making something for Intel's x86 ISA... that's the way I'm reading it, so correct me if that's not what you meant.

Yes, relax. This is what I am talking about.
 
Yeah, memory plus a couple of extra instructions. You exaggerate and split hairs.

"x86 vs x64
The x86 instruction set architecture is 32-bit, while the x64 instruction set architecture is 64-bit. x64 came as an extension of the existing x86 architecture. The registers, memory bus, and data bus on x86 architectures are 32 bits wide, while they are 64 bits on x64. Therefore, the maximum amount of addressable memory is very much higher on x64 systems than on x86 systems. x86 was introduced by Intel with the 8086, a 16-bit processor, and over time x86 was extended to 32-bit. Later, AMD introduced the x64 architecture by extending the existing x86 architecture, and x64 is fully backward compatible with the x86 instruction set."
 
I don't think I would use AVX-512, as it just about doubles power draw; it's insane.

If you are optimizing for max power draw, then it is a poor choice for sure.

If you're optimizing for computations per watt or time-to-complete, it is pretty nice. As of right now, it's not something consumers need fret about. For scientific workloads in the horrible middle area between conventional CPU and massively parallel GPU, it's really quite nice.

In general though, it's a good thing, assuming you can handle peak power.
 
If it weren't for AMD finally getting competitive within the last few years, we would still be running quad-cores...

I mean, most people would, because that'd be overkill.

But Intel had slated eight-core 10nm CPUs to release four years ago, back when they believed their own foundry people's estimates...
 