AMD Bulldozer 'Core' Lawsuit: AMD Settlement

Capacity? Performance? Flexibility to set things up the way you want?

I always try to do things with a real server and avoid appliances as much as possible, even the more enterprise-oriented brands like QNAP.

Whenever I can use server hardware, I do. Even my router is running on PC hardware using pfSense.

Also, IMHO, it sounds like a really bad idea to put your storage on the same device that is your WAN bridge and firewall.

My storage server has 2x 8c/16t Xeons, 256GB of RAM, 12x 10TB hard drives, 8x SSDs for various caching and similar purposes, and dual 10GbE adapters.

I can probably get about 2GB/s reads off the hard drive array, and it supports a lot of the stuff I do in my home.
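
For anyone who wants to sanity-check that 2GB/s figure, here's a rough back-of-envelope sketch (the per-drive speed, parity layout, and protocol overhead below are illustrative assumptions, not measurements from this particular box):

```python
# Back-of-envelope: can 12 HDDs plausibly deliver ~2 GB/s sequential reads,
# and can dual 10GbE actually carry it? All inputs are assumptions.

DRIVES = 12
PER_DRIVE_MBPS = 220       # assumed sequential read of a 7200rpm 10TB drive, MB/s
DATA_FRACTION = 10 / 12    # assumed RAID-6-style layout: 10 data + 2 parity drives

array_read_gbs = DRIVES * PER_DRIVE_MBPS * DATA_FRACTION / 1000
print(f"Theoretical array sequential read: ~{array_read_gbs:.1f} GB/s")

# Dual 10GbE: 2 x 10 Gbit/s = 2.5 GB/s raw, minus assumed protocol overhead.
net_raw_gbs = 2 * 10 / 8
NET_EFFICIENCY = 0.9       # assumed TCP/SMB overhead factor
print(f"Usable dual-10GbE bandwidth: ~{net_raw_gbs * NET_EFFICIENCY:.2f} GB/s")
```

Both land right around 2GB/s, so the disks and the network are reasonably matched under those assumptions.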

Can't do that with a Raspberry Pi. :p

Sounds fair enough. I only asked as an educational exercise because, given my use case, I couldn't conceive of needing to dedicate an entire PC build to a file server; the most demand I ever place on my network storage is an occasional Blu-ray rip, Steam backup, or system image transfer. It would seem our needs are vastly different, considering you went a step further by employing enterprise-grade servers. There's no way I could justify that for my 1-bedroom apartment lol.
 


Well, for me it started with a little 5-bay Drobo in 2012. Then I wanted it to be remote, so I attached it to a mini-PC running Linux.

Then I got tired of the poor performance of the thing, so I built a FreeNAS box.

Then I wanted to run other stuff on my FreeNAS box as well, so I installed VMware ESXi and ran VMs, and things have just kind of spiraled out of control from there.
 
I'm curious. What are the advantages of re-purposing an x86 desktop as a file server instead of just using a NAS, a Raspberry Pi, or plugging an external storage drive into the router?
You use what you've got on hand at any given time. That's what I had to work with.

This is why I didn't use the Raspberry Pi that had just come out at the time:
"While operating at 700 MHz by default, the first generation Raspberry Pi provided a real-world performance roughly equivalent to 0.041 GFLOPS. On the CPU level the performance is similar to a 300 MHz Pentium II of 1997–99. The GPU provides 1 Gpixel/s or 1.5 Gtexel/s of graphics processing or 24 GFLOPS of general purpose computing performance. The graphical capabilities of the Raspberry Pi are roughly equivalent to the performance of the Xbox of 2001."

I had all the hardware to take that re-purposed desktop and turn it into a blazing, hardware-RAID-controller-driven monstrosity with 8 drives. I was using WHS (Windows Home Server) at the time.
 

I too used an FX-8120, and later an FX-8350, for a home server for a while.

While they absolutely sucked ass as desktop chips, they were actually really decent server CPUs.

In the end, what killed it for me was that I wanted to run more VMs, and the 32GB RAM limit (using the largest 8GB DDR3 sticks available at the time) just wasn't enough.
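
To put that 32GB ceiling in perspective, here's a quick sketch of how fast a VM host eats through it (the VM names and sizes are hypothetical, not my actual layout):

```python
# Hypothetical RAM budget on a 32GB host (4 DIMM slots x 8GB DDR3).
# Every VM below is an illustrative assumption, not a real inventory.

HOST_RAM_GB = 4 * 8          # largest common DDR3 sticks of the era: 8GB each
HYPERVISOR_OVERHEAD_GB = 2   # assumed reservation for the hypervisor itself

vms = {
    "storage/NAS VM": 8,
    "media server": 4,
    "Windows VM": 8,
    "lab router VM": 2,
    "misc Linux services": 4,
}

used = HYPERVISOR_OVERHEAD_GB + sum(vms.values())
print(f"Allocated {used} GB of {HOST_RAM_GB} GB -> {HOST_RAM_GB - used} GB headroom")
```

Five modest VMs and you're already at 28 of 32GB; one more Windows guest and the box is swapping.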

That's when I upgraded to my first dual Xeon server, which I recently upgraded to my second one.
 
While they absolutely sucked ass as desktop chips,
How so? I had both of those and they were fine for normal desktop and gaming use.
Gaming-wise I haven't noticed much of a difference in performance going to the 2600X; the only thing that really improved is "physics".
 

Well, for standard desktop use, yes, but for that you could probably get away with a 10-year-old Atom.

The FX series had abysmal per-thread performance, which held me back in many titles and applications, most notably Red Orchestra 2, which was my favorite title back then. It actually performed better on my Phenom II 1090T than it did on the FX-8150 I briefly used before selling it.

There was absolutely nothing I liked about AMD's FX chips, until I put one in a server.
 
How so? I had both of those and they were fine for normal desktop and gaming use.
Gaming-wise I haven't noticed much of a difference in performance going to the 2600X; the only thing that really improved is "physics".
Single-thread performance was utter garbage. I forget the exact numbers, but the Phenom II was actually a stronger processor in almost every respect. The thing that killed me on the desktop, in gaming specifically (and in some older publishing applications), was the single-thread performance. Many older 32-bit games were not multithread-aware. The one that really jumps out in my mind is Sword of the Stars. It ran like complete ass on the FX series. It ran damn near flawlessly on my Phenom II, though, and it ran even better on Intel... anything Intel.

The other game that really chugged with that processor is Sins of a Solar Empire, especially when you load it up with mods like Sins of a Galactic Empire.

As always, it really depends heavily upon your use case. I know plenty of people who loved the 8-core FX chips and swore by them.
 
How so? I had both of those and they were fine for normal desktop and gaming use.
Gaming-wise I haven't noticed much of a difference in performance going to the 2600X; the only thing that really improved is "physics".

In 'CPU-lite' use cases they were fine, outside of being inefficient. They really only made sense in budget-constrained scenarios, though.
 
I see. I used mine for browsing, movies, and gaming. They never seemed to hold me back, but I guess I wasn't doing anything they didn't like. Maybe the 4.5-4.8GHz I had them at helped?!
 

Near-5GHz overclocks certainly helped, but were never enough to close the gap with Intel, even with Intel at stock clocks.

I forget the numbers now, but the FX had something like a 40% IPC deficit compared to contemporary Intel chips.
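
As a rough illustration of why clocks alone couldn't close it (the IPC figures below are illustrative assumptions in line with that ~40% number, normalized to Sandy Bridge = 1.0):

```python
# Effective single-thread performance ~ clock x IPC.
# Both IPC values and the Intel clock are illustrative assumptions.

sandy_bridge = {"clock_ghz": 3.4, "ipc": 1.00}  # stock Sandy Bridge quad, approx.
bulldozer    = {"clock_ghz": 4.8, "ipc": 0.60}  # heavily overclocked FX, ~40% IPC deficit

def effective(cpu):
    return cpu["clock_ghz"] * cpu["ipc"]

ratio = effective(bulldozer) / effective(sandy_bridge)
print(f"Overclocked FX vs stock Sandy Bridge, single thread: {ratio:.0%}")
```

Even spotting the FX a 1.4GHz clock advantage, it still lands around 85% of the Intel chip per thread under these assumptions.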
 
I got my court-appointed email for this today (I purchased the 8350 the day it was released). I'm not making a claim. That would be admitting that I was an idiot who didn't know what he was buying, and I definitely knew how the multicore design worked; it's not like that was a secret.
 
I see. I used mine for browsing, movies, and gaming. They never seemed to hold me back, but I guess I wasn't doing anything they didn't like. Maybe the 4.5-4.8GHz I had them at helped?!
I had a buddy who was rocking the OC on his, and it did narrow the performance deficit. It depends on what you were playing at the time. He was a big EVE Online player and successfully ran multiple sessions of the game simultaneously with no issues at all.
 
I got my court-appointed email for this today (I purchased the 8350 the day it was released). I'm not making a claim. That would be admitting that I was an idiot who didn't know what he was buying, and I definitely knew how the multicore design worked; it's not like that was a secret.
1. AMD already lost the money.
2. I don't understand why you actually care at this point. File the request, claim some money, and possibly even get your entire mistaken investment returned to you. Honestly, you're more of an idiot if you don't make a claim ;).
3. No one cares how you feel about that ages-old purchase, except you.

I will never think you're an idiot unless you give me a reason to (y)
 
Our poor receptionist spent the last couple of days going through all of our CPU orders and printing off proofs of purchase to mail in for this.

I wonder if we will see the rest of the Bulldozer lineup come up next. If we can file claims against the 6-core CPUs, we would need to send the proofs of purchase in a box lol.
 
I think it's just the 8-core parts, because of the narrative of the witch hunt that this sham of a court battle was. If I recall, the big argument was that the 8-core parts were actually more like quad-cores with hyperthreading. I don't recall anything about 6-core parts being 3-core CPUs with hyperthreading. This was all about people who were butthurt over the 8-core parts and AMD's marketing not being stellar.

Who knows..
 
1. AMD already lost the money.
2. I don't understand why you actually care at this point. File the request, claim some money, and possibly even get your entire mistaken investment returned to you. Honestly, you're more of an idiot if you don't make a claim ;).
3. No one cares how you feel about that ages-old purchase, except you.

I will never think you're an idiot unless you give me a reason to (y)

I don't consider it a mistake; I knew exactly what I was buying. I think the lawsuit is stupid and set up for stupid people. It's not about other people thinking I'm stupid; it's about admitting I was by making the claim. Similar to admitting guilt by paying a ticket. It's not worth a couple hundred bucks.
 

Cool, then I am happy to be stupid in my pursuit of 209 bucks. AMD already lost the suit; I am just cashing in on a judgement I had nothing to do with, and one that I actively argued against every time it came up on a forum or in conversation.

If you will not allow yourself to accept free money, good for you. Your statement that anyone who will profit off this judgement is an idiot does give me reason to pause. Everyone is entitled to their own opinion. However, in an informational thread, your opinion seems rather combative instead of helpful.
 
You can call people who buy top-tier hardware without knowing what they're buying whatever you want. I choose the simplest, most likely term.
 
If I recall, the big argument was that the 8-core parts were actually more like quad-cores with hyperthreading. I don't recall anything about 6-core parts being 3-core CPUs with hyperthreading.

If it applies to the eight-core parts, it applies to all of them. They were all sold with modules being represented as two 'cores', however many modules were enabled.
 

Yeah. Total bullshit argument (there were definitely two real cores per module), but if you buy the argument for the octa-cores, you should also buy it for the others, as long as you didn't get a model with one of the two cores per module disabled. I vaguely recall there were some quad-cores like this, but I am not 100% sure.
 

My basic understanding is that there were two ALUs (integer cores) and one shared FPU per module. On a CPU basis, with four modules versus a desktop i7 with four cores, that's actually more physical ALUs on the Bulldozer than on the i7; however, also on a CPU basis, the AMD modules were slower. Slower even than the cores of the architecture they replaced, and I think that being slower is the 'core' of the argument.
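
A quick sketch of the topology difference the lawsuit argued over, using the commonly cited per-module counts (the code itself is just an illustration, not anyone's official model):

```python
# A Bulldozer module: two integer cores sharing one FPU, front-end, and L2.
# An Intel core with Hyper-Threading: one physical core exposing two threads.

from dataclasses import dataclass

@dataclass
class BulldozerModule:
    integer_cores: int = 2   # two independent integer clusters
    fpus: int = 1            # one shared floating-point unit
    threads: int = 2

@dataclass
class IntelHTCore:
    integer_cores: int = 1   # one physical core
    fpus: int = 1
    threads: int = 2         # SMT presents two logical CPUs

fx_8350 = [BulldozerModule() for _ in range(4)]   # 4 modules
i7_2600 = [IntelHTCore() for _ in range(4)]       # 4 cores

for name, chip in (("FX-8350", fx_8350), ("i7-2600", i7_2600)):
    print(name,
          "| integer cores:", sum(u.integer_cores for u in chip),
          "| FPUs:", sum(u.fpus for u in chip),
          "| threads:", sum(u.threads for u in chip))
```

Both present 8 threads to the OS; the dispute was over whether the FX's 8 integer cores, sharing 4 FPUs and front-ends, counted as 8 'cores'.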

No one would have cared if they were faster.

I still like the idea of Bulldozer, but it was clear at launch that AMD had fumbled significantly relative to Intel. It remains disappointing both that AMD never really got Bulldozer competitive and that it took so long for them to return to producing competitive parts that didn't have to be bargain-priced to sell.
 
Was it really uncompetitive when Intel was cheating with insecure CPUs? Those i7s take a 14-17% performance hit on average workloads compared to the 8350 with current mitigations. Add the price difference, and I don't think Bulldozer was as bad as it appeared at the time.

Even more so if you have to disable HT due to heightened security concerns.
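
Roughly the math being gestured at here (the raw scores and the 15% midpoint penalty are illustrative assumptions, not benchmark data):

```python
# Apply a post-mitigation penalty to a hypothetical i7 score and compare.
# All numbers are illustrative assumptions.

i7_raw = 120                # hypothetical pre-mitigation benchmark score
fx_raw = 100                # hypothetical FX-8350 score (unaffected baseline)
MITIGATION_PENALTY = 0.15   # midpoint of the claimed 14-17% hit

i7_mitigated = i7_raw * (1 - MITIGATION_PENALTY)
print(f"i7 after mitigations: {i7_mitigated:.0f} vs FX-8350: {fx_raw}")
```

Under those assumptions a 20% lead shrinks to about 2%, which is the shape of the argument, whatever you think of it.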
 
The question is: was there any significantly better CPU available from Intel for the same price?

Not really; Intel's significantly faster CPUs were also significantly more expensive.

That picture changes significantly when Intel is playing the same game with mitigations applied, rather than cheating like they were back then.
 
30% more expensive for 20% more performance still leans in favor of AMD.

Even the 2500K was more expensive, and it ends up about equal to the AMD chip on average.
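
To put numbers on that value argument (the ~$195 FX-8350 launch price is approximate, and the 20%/30% deltas are just the round figures from this exchange):

```python
# Performance per dollar using the round numbers quoted above.

fx_price, fx_perf = 195.0, 1.00   # approx. FX-8350 launch price, baseline perf
intel_price = fx_price * 1.30     # "30% more expensive"
intel_perf = fx_perf * 1.20       # "20% more performance"

print(f"FX value:    {fx_perf / fx_price * 1000:.2f} perf per $1000")
print(f"Intel value: {intel_perf / intel_price * 1000:.2f} perf per $1000")
```

That works out to roughly 5.13 vs 4.73 in AMD's favor; value and absolute performance just answer different questions.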
 

Sure, AMD has a track record of being good for the budget. That's been true for a very long time and is not in dispute. But if you needed that 20% extra performance, it didn't matter how much extra it cost.

Until recently, if you needed performance, that meant Intel. Now that AMD has a brilliant architecture under its belt again, it's excelling in both price and performance. No one should ever have mistaken Bulldozer for a performance chip.
 
You're missing the point. If the argument is that a customer would have purchased an Intel CPU had they known the AMD CPU was not going to perform like they thought, then what Intel CPU would they have been able to purchase for the <$200 that the AMD CPUs went for at the time? The i5-2500K was released at around 220 bucks. The performance difference between that CPU and the AMD CPU they were hypothetically duped on is roughly the same when looking at an average of a bunch of benchmarks (which I posted a link to above). So the only way a customer would have gotten significantly better performance than the AMD part would be if they spent significantly more money.

That is, when the Intel parts are forced to be as secure as the AMD part.

So if the alternative performed roughly the same for the same price, what damage is being done to the customer? If they were willing to pay 30+% more, they would never have considered the AMD part in the first place.

The idea that there were damages because the customers would have chosen the Intel parts only applies if you forget that the Intel CPUs at the same price as the AMD chip were only faster because they were insecure.
 
So the only way a customer would have gotten significantly better performance than the AMD part would be if they spent significantly more money.

I'm not missing any point -- this is my point, but you're determined to appear to disagree regardless.

The idea that there were damages because the customers would have chosen the Intel parts only applies if you forget that the Intel CPUs at the same price as the AMD chip were only faster because they were insecure.

Nothing is secure. Welcome to reality!

Over time, every weakness may be exploited.

And at the time, such issues hadn't been discovered. Thus, they're meaningless in the context of the comparisons you're making. There is no 'if', because at that time, and for over five years after, no one knew.
 
I got my AMD settlement check a few days ago.

I claimed two CPUs: my original FX-8120 I bought for my server, and the FX-8350 I later replaced it with.

I knew what I was getting into when I bought mine, and I think this lawsuit was a total sham, but once AMD paid the money, they are not getting it back, so I figured I might as well file my claim anyway.

Wound up with a check for $60.84, so I guess after all the claims settled, the going rate per CPU was $30.42.
 
[screenshot of the poster's changed forum user title]
 

Heh, yeah. Kyle thought it was funny to change that after I started a thread about how user interfaces have gone to shit in the mobile era.

Apparently he didn't think that was a real issue worth bringing up.

I had earned my own title, but I guess I unearned it by daring to mention that I think modern flat interfaces, with tons of dead space and huge Fisher-Price-looking buttons for everything, aren't very usable. Oh well. I don't really care.
 
I got my AMD settlement check a few days ago.

I claimed two CPUs: my original FX-8120 I bought for my server, and the FX-8350 I later replaced it with.

I knew what I was getting into when I bought mine, and I think this lawsuit was a total sham, but once AMD paid the money, they are not getting it back, so I figured I might as well file my claim anyway.

Wound up with a check for $60.84, so I guess after all the claims settled, the going rate per CPU was $30.42.
Will confirm that: I claimed one for the FX-8320 I bought within the period designated by the terms, and got a check for that amount. No complaints; even if I also thought the lawsuit was garbage, if they're handing out free money because I bought a qualifying CPU, I'll also get in line.
 
Yeah, I got my 30 bucks back, so the CPU only cost me 20 bucks since it was on sale for 50!
 