AMD Selected by U.S. To Shape the Future of High Performance Computing

HardOCP News

AMD today announced that it was selected for an award of $12.6 million for two research projects associated with the U.S. Department of Energy’s (DOE) Extreme-Scale Computing Research and Development Program, known as “FastForward.” The DOE award provides up to $9.6 million to AMD for processor-related research and up to $3 million for memory-related research. AMD’s award-winning AMD Opteron™ processor has powered many of the world’s largest supercomputers over the past decade, and the company invented the world’s first and only Accelerated Processing Unit (APU).
 
Now, I like AMD as much as the next AMD fan (or intel non-fan), but isn't that a bit.... exaggerated with their accomplishments?
 
Well, AMD made that particular press release so one can probably expect it to be a little bit far-fetched and insane. Just like RIM, they need to generate as much positive energy as possible so investors don't panic sell just because they've been struggling to compete for the last five years.
 
We're all doomed.

The government was never technically savvy, just saying...
 
Now, I like AMD as much as the next AMD fan (or intel non-fan), but isn't that a bit.... exaggerated with their accomplishments?

AMD's Opteron line is darn good, especially where performance/$ is concerned compared to Intel. There's a reason that almost no one on [H] Team 33 uses Intel Xeons for large-scale folding, and everyone uses Opteron 4P.

AMD's desktop line is of course, a completely different story.
 
The government almost always awards contracts to the smaller bidder (with extreme bonus points for "diversity"). So an Intel vs AMD decision may have simply been a matter of picking the underdog for its own sake.
 
Now, I like AMD as much as the next AMD fan (or intel non-fan), but isn't that a bit.... exaggerated with their accomplishments?

That's why it's called marketing. What I don't understand is why companies try to market that BS here; most of the forum's members aren't gonna buy that hogwash. Oh, I just answered my own question. They believe their own BS.
 
The US Government loves AMD for some reason. We got pallets of old AMD computers from them before.
 
AMD's Opteron line is darn good, especially where performance/$ is concerned compared to Intel. There's a reason that almost no one on [H] Team 33 uses Intel Xeons for large-scale folding, and everyone uses Opteron 4P.

AMD's desktop line is of course, a completely different story.

I thought that most of the [H]orde's computing power came from graphics processors and not CPUs since those kinds of tasks work better in highly parallel video chips.
 
A few notes for the AMD hate around here.

Yes, as Skripka mentioned, Opterons make a good basis for a supercomputer.

In case you missed it, 7970s have the best floating point performance out there (single precision is almost always useless for supercomputers). Intel has some vaporware out there that may someday just barely beat it (but not a hypothetical dual-GPU beast likely to beat said vapor to market).
 
I thought that most of the [H]orde's computing power came from graphics processors and not CPUs since those kinds of tasks work better in highly parallel video chips.

If you're in the top-20 producers for [H]orde you have multiple 4p setups. Currently the top-20 account for 3/4 of [H]orde's production. With a 200-300W GPU you generate 10-25k PPD or so...with a 400W 4P setup from the wall you cook 10x that PPD. That is the maths as of now.

With new poorer-PPD GPU WUs, and no client for the new GTX 680 chips, eVGA got caught out. OTOH, with summer electrical rates and the heatwave in the US, our production has dropped too.
 
A few notes for the AMD hate around here.

Yes, as Skripka mentioned, Opterons make a good basis for a supercomputer.

In case you missed it, 7970s have the best floating point performance out there (single precision is almost always useless for supercomputers). Intel has some vaporware out there that may someday just barely beat it (but not a hypothetical dual-GPU beast likely to beat said vapor to market).

I don't think anyone in this thread is being hateful so far, though posts like that have a tendency to bring out some extremist views, in the same way people mention Apple hate in threads where people are just chatting and everything goes downhill from there.

In any event, AMD's graphics cards are nice, but their execution with respect to drivers has been pretty bad in recent history. The ATI arm of AMD is hanging on because there are still some talented engineers from ATI's days as a Canadian company that haven't jumped off the sinking ship yet. I hope they stick around to keep pushing out hardware, but AMD is failing in the same way VIA did, but on a larger scale.
 
If you're in the top-20 producers for [H]orde you have multiple 4p setups. Currently the top-20 account for 3/4 of [H]orde's production. With a 200-300W GPU you generate 10-25k PPD or so...with a 400W 4P setup from the wall you cook 10x that PPD. That is the maths as of now.

I suppose that makes sense. I really haven't seen the numbers myself so I'm sorta meh about the entire thing either way. It just seems odd that the limited long/slow pipeline of an Opteron processor bound to very slow system memory would be competitive with optimized software and workloads.

With new poorer-PPD GPU WUs, and no client for the new GTX 680 chips, eVGA got caught out. OTOH, with summer electrical rates and the heatwave in the US, our production has dropped too.

Yup, the nVidia types are no longer focusing on GPU compute with their desktop products and who can blame them? They have other products to target that market segment so there's no really good reason to spend the power and die-size budget on compute chores in a market that's largely looking just for framerate in whatever the console port of the week happens to be.

Bleah at power and heat though. Computers generate too much of it to be reasonable. It's a shame VIA never got much of anywhere with their Eden chips and Intel isn't able to push the Atom further into the desktop and laptop market.
 
In any event, AMD's graphics cards are nice, but their execution with respect to drivers has been pretty bad in recent history.

This is getting pretty tired. The only place where their drivers have been flaky is on Windows with regard to multi-card or Eyefinity gaming. That is irrelevant to this area and also to how most people use their cards. Additionally, their Windows drivers for single cards are fine. At the very least, they haven't cooked cards like that fiasco with Nvidia's fan profiles.

If we want to talk about edge cases, their binary blob for Linux has been getting better and better. Also on Linux, my ATI cards work flawlessly with the open source driver. Nouveau (the community open source driver for Nvidia cards) makes ATI's Windows Eyefinity problems look like minor hiccups. It's just a nightmare trying to get Nouveau to do anything worthwhile.

More importantly, this is a thread about HPC. Depending on the workload, GPU-based solutions like Tesla may not even be appropriate or cost effective.
 
hmm... AMD is getting high performance computing contracts in both the US & China, interesting.

Level playing field I guess.
 
If you're in the top-20 producers for [H]orde you have multiple 4p setups. Currently the top-20 account for 3/4 of [H]orde's production. With a 200-300W GPU you generate 10-25k PPD or so...with a 400W 4P setup from the wall you cook 10x that PPD. That is the maths as of now.

With new poorer PPD GPU WUs, and no client for the new GTX680 chips, eVGA got caught out. OTOH, with summer electrical rates and the heatwave in the US our prodcution has dropped too.

Wowwww, it's been a while since I last checked out the team rankings, but you guys have played some major catch-up since then! Well done!
 