Intel Core i7-3960X - Sandy Bridge E Processor Review @ [H]

I found a few things in the article that might need corrections:

a) Is the political commentary needed in a hardware review? Unless turning a hardware discussion thread into a political discussion is desired :)

b) dropdown for the article page 5 says "Archivining"

c) page 4 "but it surely we become and e-peen synthetic benchmark to be held out" wut?

d) intro for the article "Intel as done the impossible"

e) page 1 "Along with is huge 130 watt TDP"

f) page 2 "this system on the 8960X" did you mean 3960X?

g) page 2 "Processor cooling used on all test systems is a Corsair H100 self-contained water cooling system. We moved away from using this system on the 8960X due to the fact that we were constrained on time after waiting on Asus to deliver mass production BIOS flashes to us."
a little later on page 2 "We are using the Corsair H100 for all testing here today. "
I'm confused as to what "moved away from using" means? What was used on the 3960X instead of the H100, was it a koolance, or was the H100 also used on the 3960X?

h) page 6 "and encoding it with to an MKV format"

i) page 7 "the best suited applications areas"
 
I too find this a bit hilarious. SB adopters were absolutely scathing in their criticism of anyone (like myself) who decided the gains coming from Nehalem were too insignificant to justify the cost. Now, these same people balk at the very idea of SB-E, something that clearly outclasses their precious new setup.

Vega will grab one because he can use it. The wanna-bes will keep on trashing SB-E.

Too much epeen and emotion surrounds this sort of stuff.

I think it's more a case of a still-poor economy, along with a healthy dose of computing reality, finally delivering the sledgehammer blow to the noggin.

Except for the top ten *current* PC gaming titles (and not even all of the top ten), how many take full advantage of even a Q6xxx and a midrange (HD67xx) GPU? If they do, merely upgrading to an i5-2500K will put the CPU into *loafing mode*; if they don't, that almost certainly means the game in question is not multicore-aware or even multicore-friendly.

HD67xx? I specified that level for two reasons: it's the floor level of powered-GPU capability (typically either a single 6-pin or dual-Molex is warranted), and it's not pricey (retail for this tier is no worse than $130, and has dipped into the $100 USD range via either rebates or sale pricing). Teaming such a GPU with even a Q6xxx at stock clocks is not going to bottleneck any decently written game.

Kyle indeed has the right of it - SB-E for gamers is odious - and odiferous - overkill that commits the additional "sin" of being Extremely Expensive.
 
I so want this and will pay anything for the Extreme Edition to speed up my 3D Studio Max and Maya render times. I will push that thing and the quad-channel RAM bandwidth to its limits. I'm going to have so much fricking Mudbox/ZBrush-sculpted high-poly crap on the screen; I need it bad. Maybe I won't even have to use displacement maps...
 
I so want this and will pay anything for the Extreme Edition to speed up my 3D Studio Max and Maya render times. I will push that thing and the quad-channel RAM bandwidth to its limits. I'm going to have so much fricking Mudbox/ZBrush-sculpted high-poly crap on the screen; I need it bad. Maybe I won't even have to use displacement maps...

Bingo.

Intel designed this chip for people like you.
 
While I don't disagree with any of the points that Kyle brings up, I don't really see this as anything new.

Intel has been doing EE procs for quite a while now, and none of them have made a lot of sense, value-wise. While the increased power consumption certainly sets off bells in my head, I don't think Intel should be criticized for what is essentially business as usual.

That said, I really can't understand why Intel wouldn't send [H] a K to review. EE procs have never been, and I hope never will be, [H]'s style, but the K might be more on the mark. Pretty sure the entire crew has been around long enough that you will get yawns at best from a $1,000 processor.
 
I thought this was the most retarded thing Intel has launched. I mean, who the hell has the money (or at least the need) to go out and buy this if they have SB already? Or, in my case, an i7-920... I don't need anything more, plus I don't see spending $1,000+ on just a processor that I don't need at all, much less the $300+ board. I don't even see paying for the $600 processor that I don't need at all. And I game all the time: two cores for gaming, two cores for other stuff and the OS.
 
Kyle, question for you:

Anand saw a very small idle power bump between the new SB-E chip and SB. Your review shows a 60W difference! He does use 4 DIMMs on his SB system, but that should make the SB system draw more power, not make the SB-E system draw less.

Any ideas as to where the difference is? Maybe the motherboard? (He used an Intel board, you used a high-end Asus board.)

Tom's also saw much lower idle power, in line with normal SB chips. They tested using an Intel board and the Asus ROG (but don't state which board the power tests were done on).

Edit: See you posted above with the full list.
 
Tom's also saw much lower idle power, in line with normal SB chips. They tested using an Intel board and the Asus ROG (but don't state which board the power tests were done on).
Given how neatly it falls into line with the other Intel DX79SI numbers, I'd say it was a fair bet that's the board that was used. Asus seems to incur an instant 50W penalty for some reason.
 
I thought this was the most retarded thing Intel has launched. I mean, who the hell has the money (or at least the need) to go out and buy this if they have SB already? Or, in my case, an i7-920... I don't need anything more, plus I don't see spending $1,000+ on just a processor that I don't need at all, much less the $300+ board. I don't even see paying for the $600 processor that I don't need at all. And I game all the time: two cores for gaming, two cores for other stuff and the OS.

I assume you see the 580/590, 6990, etc. GPUs as 'retarded' as well then? What about dual cards, let alone tri- or quad-SLI...
 
Well, as an i7-930 user, I am happy to see that I don't have any immediate need to upgrade just yet. This CPU is pretty darn impressive compared to a stock 920, finishing some encoding tasks in half the time, but my 920 is overclocked, so it's only about 30% faster for me. I'd never buy a 30% increase in performance for roughly 3.5x the price, though, and a 920 versus a 3960X seems to be about that: $300 versus $1k+.

When the 'next' generation of CPUs from Intel comes out, if we can see this level of performance or better from their $300-range processors, I'll probably upgrade at that point. Can't wait for Ivy's 22nm goodness.
 
Given how neatly it falls into line with the other Intel DX79SI numbers, I'd say it was a fair bet that's the board that was used. Asus seems to incur an instant 50W penalty for some reason.

Yep, agreed.

However, just as interesting when looking at those numbers is the difference in idle power across the 2600K tests (what motherboard was each of those done on?).

Quite a big difference there as well, so the motherboard has a lot to do with it. Perhaps we need motherboard comparisons using the same chip.
 
Also, who the hell cares about power consumption when the CPU costs $1000? Are you telling me, if you have the money to get it, you would skimp on the PSU and cooling?

If you consider power use strictly from an economic perspective I can kind of see this.

At my local electricity rate (15.882 cents/kWh), the monthly cost difference versus a 2600K is $7.07 (stock) or $7.65 (OC) if you idle 24/7. If you are at full load 24/7, the monthly cost difference is $17.70 (stock) or $24.35 (OC).

Considering I usually idle maybe 4 hours a day and spend maybe 2 hours a day at some kind of load (but almost never full load, except for stability testing), the power costs are mostly insignificant.
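
For anyone who wants to plug in their own rate and usage pattern, here's a minimal sketch of that arithmetic in Python. The ~61 W idle delta lines up with the ~60 W difference mentioned above, and the ~150 W load delta is back-figured from the dollar amounts; treat both as assumptions rather than measured values:

Code:
# Rough monthly cost of the extra draw vs. a 2600K.
# The deltas below are assumptions inferred from the figures quoted above.
RATE_PER_KWH = 0.15882      # local rate, $/kWh
IDLE_DELTA_W = 61           # assumed extra idle draw, watts
LOAD_DELTA_W = 150          # assumed extra full-load draw, watts
HOURS_PER_MONTH = 730       # average month

def monthly_cost(delta_watts, hours):
    # watts -> kWh over the given hours, then dollars at the local rate
    return delta_watts / 1000.0 * hours * RATE_PER_KWH

# Worst case: idling 24/7 all month
print(round(monthly_cost(IDLE_DELTA_W, HOURS_PER_MONTH), 2))        # ~7.07

# A lighter real-world pattern: ~4 h idle + ~2 h load per day
print(round(monthly_cost(IDLE_DELTA_W, 4 * 30.4)
            + monthly_cost(LOAD_DELTA_W, 2 * 30.4), 2))             # ~2.63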

On the other hand, if you consider the cumulative damage of using more power than you absolutely have to, it makes more sense, regardless of your income level.

Even if you are one of those people who deny all reason (human-caused climate change), there is a case to be made against wasting electricity.

1.) Burning coal is really, really bad. It puts heavy metals (like mercury and lead) into the atmosphere for us to breathe, and undoubtedly shortens most of our lives.

2.) Less energy use = more energy freedom. Not needing to depend on a part of the world that hates us for our energy is a pretty good thing IMHO.
 
I thought this was the most retarded thing Intel has launched. I mean, who the hell has the money (or at least the need) to go out and buy this if they have SB already? Or, in my case, an i7-920... I don't need anything more, plus I don't see spending $1,000+ on just a processor that I don't need at all, much less the $300+ board. I don't even see paying for the $600 processor that I don't need at all. And I game all the time: two cores for gaming, two cores for other stuff and the OS.

Money is all relative to people's current standing. You ask 'who the hell' has the money to go out and buy this? Well, lots of people in higher-end professions or well-paying jobs: actors, doctors, lawyers, singers, CEOs, executives, day traders, etc. If you just got paid $3 million for making your last movie, or are like the HP executive that just received a $30 million golden parachute, spending $1,000.00 on your next PC upgrade might seem like nothing.

Intel is not retarded for launching this kind of thing; rather, it's targeting a very specific market (content producers and people with more money than sense) while also trying to make people buy into the "we are the best because we hold the performance crown" belief. By being the top dog in processors, they might very well convince a few thousand customers on the edge between buying AMD or Intel to go Intel. I can't blame them for trying.

That being said, I don't have $1,000 to burn on small incremental upgrades, so I'll be waiting for Intel's next generation before I upgrade my overclocked i7-930. I'm also of the belief that a good SSD and a good SLI setup you could get for the same amount would make more of a performance difference in video games than a small increase in processor power.

I'm sure a lot of Intel's marketing guys would like to try to argue the opposite, but not a lot of [H]ardOCPers would.
 
Well, as an i7-930 user, I am happy to see that I don't have any immediate need to upgrade just yet. This CPU is pretty darn impressive compared to a stock 920, finishing some encoding tasks in half the time, but my 920 is overclocked, so it's only about 30% faster for me. I'd never buy a 30% increase in performance for roughly 3.5x the price, though, and a 920 versus a 3960X seems to be about that: $300 versus $1k+.

But you are comparing the wrong area of the market anyway. You opted for a 920 rather than a 975/980 because that best suited you; if the same is still true, then a 3960/3930 is not aimed at you anyway. The 3960/3930 are 'replacements' for the high-end i7s.

It's like someone who's happy with a $100 GPU reading a review of a newer $600 GPU and then stating that they would never pay the increment for the percentage increase in performance and are happy to wait for the new $100 card in the same range.
 
Yep, agreed.

However, just as interesting when looking at those numbers is the difference in idle power across the 2600K tests (what motherboard was each of those done on?).

Quite a big difference there as well, so the motherboard has a lot to do with it. Perhaps we need motherboard comparisons using the same chip.
Probably the video card. Hence the deltas correlate so well even when the absolute numbers are shifted.
 
Zarathustra[H];1038022481 said:
If you consider power use strictly from an economic perspective I can kind of see this.

At my local electricity rate (15.882 cents/kWh), the monthly cost difference versus a 2600K is $7.07 (stock) or $7.65 (OC) if you idle 24/7. If you are at full load 24/7, the monthly cost difference is $17.70 (stock) or $24.35 (OC).

Considering I usually idle maybe 4 hours a day and spend maybe 2 hours a day at some kind of load (but almost never full load, except for stability testing), the power costs are mostly insignificant.

On the other hand, if you consider the cumulative damage of using more power than you absolutely have to, it makes more sense, regardless of your income level.

Even if you are one of those people who deny all reason (human-caused climate change), there is a case to be made against wasting electricity.

1.) Burning coal is really, really bad. It puts heavy metals (like mercury and lead) into the atmosphere for us to breathe, and undoubtedly shortens most of our lives.

2.) Less energy use = more energy freedom. Not needing to depend on a part of the world that hates us for our energy is a pretty good thing IMHO.

Totally agree, but it's all relative; why aren't we all using Atom-powered netbooks?

No personal dig, but I notice you also use three screens, a 30" and two 20". Very nice, but power usage is obviously not a real concern to you, otherwise you'd just be on one. If you need that much screen real estate (and I don't doubt it; I have a 24" and three 19" myself) then you accept the power usage, just as someone needing a 6-core over a 4-core over a 2-core etc. accepts the extra wattage required to power them.
And the CrossFire cards, and the server with all that lovely storage space... :D

Forgive me if I'm wrong, but I don't think you are out to save the world from CO2 emissions by reducing your usage. ;)
 
While I don't disagree with any of the points that Kyle brings up, I don't really see this as anything new.

Intel has been doing EE procs for quite a while now, and none of them have made a lot of sense, value-wise. While the increased power consumption certainly sets off bells in my head, I don't think Intel should be criticized for what is essentially business as usual.

That said, I really can't understand why Intel wouldn't send [H] a K to review. EE procs have never been, and I hope never will be, [H]'s style, but the K might be more on the mark. Pretty sure the entire crew has been around long enough that you will get yawns at best from a $1,000 processor.

I think the point you missed was that it was being marketed as an extreme gaming CPU. I think anyone who follows forums like these and has more than a few brain cells can testify that it's a load of bullshit, like the review said. Couple that with the rather piss-poor showing for the PCIe 3.0 lanes and it becomes laughable. Unless you're planning to do your extreme gaming at 1024, you want to avoid this chip.

I don't think we've had an "extreme" chip that was a fantastic gaming chip since back in the FX days, and arguably that had less to do with the chip than it did with how the games were developed and GPU utilization. But now things like the doodoodozer and desktop workstation chips like this are falsely being sold to gamers looking for any little edge they can get.

The CPU and platform provide a decent upgrade over the current 2500K/2600K Sandy Bridge or even older Core i7 1366 offerings, but once you take the price tag into account, you're left with only a few people who could actually use this thing and for whom it would actually be worth the money paid for the upgrade. Unless, of course, you want to market it as a gaming chip and target a wider audience...

then promptly get called out on your bullshit =P
 
I have no doubt that's what you saw, just trying to make sense of the numbers.

I did an informal poll of a few other review sites:

Code:
              2600K       3960X    Delta     Motherboard
Anand:         77.6        79.3      1.7     Intel DX79SI
Tom's:         90          87       -3       Intel DX79SI(?)
PCPer:         97.5       109.7     12.2     Intel DX79SI
Legit:         51          60        9       Intel DX79SI
Neoseeker      90          96        6       Intel DX79SI
Guru3D:       105          86      -19       MSI MS-7760
Legion:        76         104       28       Gigabyte G1.Assassin2
OCAHolic       74         149       75(!)    Asus P9X79 Deluxe
[H]           102         163       61(!)    Asus Rampage IV Extreme

As you can see, the outliers occur when a third party motherboard is used. I'd love to know what's on the Rampage IV that demands 61 extra watts :eek:, or what special sauce was put into that MSI board. Please do let us know if Asus somehow patches the power consumption at some point with a new BIOS.

Bottom line on power consumption for me is, while the load isn't attractive by any means, it's still somewhat competitive at idle depending on the motherboard.

There's nothing weird about it.

All that uber useless stuff that gets put on high-end mobos consumes energy. There are also variances depending on how the VRM section was designed, as well as the number of phases and the way they operate at idle:

[Chart: idle power comparison of a few X79 mobos]

[Chart: load power comparison of the same boards]


MSI has a mechanism to shut down power phases at idle to reduce power consumption.
 
Couple that with the rather piss-poor showing for the PCIe 3.0 lanes
Support is there; it's just not ratified yet given the lack of supporting hardware.

The CPU and platform provide a decent upgrade over the current 2500K/2600K Sandy Bridge or even older Core i7 1366 offerings, but once you take the price tag into account, you're left with only a few people who could actually use this thing and for whom it would actually be worth the money paid for the upgrade.

Accepted, but then many of those saying it's too much probably change out their setup every time there is a refresh anyway, just not this one. It's all relative. There are no doubt many who switched from Core 2 to i7 to SB as soon as each came out, and others still on Core 2 who may now go to SB-E. Who's made the most of the dollars they've paid for their upgrades?
 
Support is there; it's just not ratified yet given the lack of supporting hardware

But that's why it's currently just a practically useless selling point. It's definitely something that I can see all motherboards using eventually, but with current graphics cards it's unnecessary.

Frankly, I'm not even sure when it'll become necessary. I am certain that it'll be a period of years.

Had they included what was initially planned for the X79 chipset (the new I/O controller, SATA, SAS), then I think the reviews would have been far more positive, because at least it would bring something new and innovative to their high-end desktop platform. I think offering all of the socket 2011 chips as unlocked would also have been a good move.

Currently you get PCI Express 3.0, two more cores / four more threads (or the same 4 cores / 8 threads if you wait till next year for the ~$300 version, which, by the way, will not be unlocked), and better memory bandwidth. That doesn't justify a >$600 upgrade to me, and that sentiment is likely shared by all but the very few potential consumers for whom it makes sense.

Either way, one thing's for certain: this most definitely isn't a gaming chip.

EDIT: I guess my biggest gripe is that a platform essentially meant for high-end workstations is being marketed as a gaming platform and CPU. Buying a $600 mobo/CPU to get roughly the same FPS on modern titles, even with PCIe 3.0, just doesn't make any sense whatsoever. Paying $600 for a platform upgrade if it means you'll see a productivity increase in work-related tasks, on the other hand, is completely understandable. But when you're forking over that much cash for a chip without a cooler or even USB 3.0, why not make them all unlocked? Bleh
 
The CPU and platform provide a decent upgrade over the current 2500K/2600K Sandy Bridge or even older Core i7 1366 offerings, but once you take the price tag into account, you're left with only a few people who could actually use this thing and for whom it would actually be worth the money paid for the upgrade. Unless, of course, you want to market it as a gaming chip and target a wider audience...

As someone who only uses desktops for "workstation"-type demands, I'd have no issue paying the cash for this platform. I use laptops for everything else (more than good enough even on ye olde Core 2), but this seemed like a good platform for some of the work-related stuff I do.

Price tag really means fuck all though. Once you start factoring in Quadro GPUs, SAS drives, and the other things that go into such a platform, you don't worry about things like "heat," "power consumption," "cost," or "portability," which are the main concerns for most PC consumers. You get out there in idiot-price land with idiotic components creating space heaters.

The thing is, they shot themselves in the foot by removing SAS from this chipset and limiting the number of SATA 6Gb/s ports here. This is clearly a platform aimed at real power users on the workstation end, and SAS ports, along with a metric ton of SATA ports, count far more here than the ability to overclock the piss out of it.

I can see the logic in Intel's recent separation between workstation- and desktop-class sockets; they're two very different markets. The problem is that a lot of their promises for the chipset here were just as much of a selling point as the CPU's threaded performance and the quad-channel memory (and even the value of quad channel on a workstation is dubious depending on what it is you do).

I'm not sure if these features might come back on server/workstation boards later on; who knows. When they hit 8 cores it will be worth upgrading some things, but as things stand now, with the neutered chipset, it's become a "wait and see" situation.
 
As someone who only uses desktops for "workstation"-type demands, I'd have no issue paying the cash for this platform. I use laptops for everything else (more than good enough even on ye olde Core 2), but this seemed like a good platform for some of the work-related stuff I do.

Price tag really means fuck all though. Once you start factoring in Quadro GPUs, SAS drives, and the other things that go into such a platform, you don't worry about things like "heat," "power consumption," "cost," or "portability," which are the main concerns for most PC consumers. You get out there in idiot-price land with idiotic components creating space heaters.

The thing is, they shot themselves in the foot by removing SAS from this chipset and limiting the number of SATA 6Gb/s ports here. This is clearly a platform aimed at real power users on the workstation end, and SAS ports, along with a metric ton of SATA ports, count far more here than the ability to overclock the piss out of it.

I can see the logic in Intel's recent separation between workstation- and desktop-class sockets; they're two very different markets. The problem is that a lot of their promises for the chipset here were just as much of a selling point as the CPU's threaded performance and the quad-channel memory (and even the value of quad channel on a workstation is dubious depending on what it is you do).

I'm not sure if these features might come back on server/workstation boards later on; who knows. When they hit 8 cores it will be worth upgrading some things, but as things stand now, with the neutered chipset, it's become a "wait and see" situation.

Apparently they were having issues with the controller, so they decided to neuter it, as you put it. It edges out the 990X by a small margin on some benchmarks and the SB 2600K by a small margin in others, but not nearly enough to account for the price tag and lack of new features. I think we're in agreement here.

I think the decision AMD made in providing all of their "enthusiast" chips with unlocked multipliers was very well thought out, even if they're lackluster in performance. You'd think Intel would at least follow along with their extreme gaming CPU /satire
 
Either way, one thing's for certain: this most definitely isn't a gaming chip.

It's a "no compromise" chip. Unlike bulldozer which brings not bad multithreaded performance with single threaded performance being a joke you get CPU with tons of multithreaded power without need to sacrifice single threaded performance.
 
After reading the results of Kyle's hard work and looking at the apples-to-apples benchmarks of other websites, I can only come to one conclusion about SB-E:

Why was this processor not called a Xeon?

It's as good at gaming as a $300 Core i7, no better or worse. It's great at workstation tasks, in spite of the chipset associated with it - which is pedestrian at best, barely an evolution over P67 but not quite as capable as Z68.

If I were Intel, I would have just held onto this silicon, added QuickSync, released it with a worthwhile chipset, and made the content creation/editing crowd happy before releasing Ivy Bridge to the enthusiasts and general desktop users. No reason to create false hope. That just tarnishes the brand.
 
Apparently they were having issues with the controller, so they decided to neuter it, as you put it. It edges out the 990X by a small margin on some benchmarks and the SB 2600K by a small margin in others, but not nearly enough to account for the price tag and lack of new features. I think we're in agreement here.

I think the decision AMD made in providing all of their "enthusiast" chips with unlocked multipliers was very well thought out, even if they're lackluster in performance. You'd think Intel would at least follow along with their extreme gaming CPU /satire

Yeah, I know they were having issues. The thing is, marketing bullshit aside, X58 and now X79 were/are workstation platforms. You're not really seeing that much from those extra cores, and the triple/quad-channel memory isn't worth a cup of warm piss for gaming and desktops. X58 did do proper multi-GPU compared to P55, so there is that, but again, multi-GPU is such a small fraction of people that it's not worth getting all twisted over.

I'm not sure why the locked CPUs, but in general you're not overclocking workstations, and from the quad-channel memory, to the power consumption, to what the chipset I/O should have supported, this was obviously a workstation item to begin with. Intel just slapped "Xtreme" on it; as we all know, to sell something to gamers you just call it Xtreme, slap some lights on it, hike the price by 20%, and call it a day. We are also starting to hit the point where the amount of RAM supported means you'd want ECC or something like that, not overclocked RAM.

I think Intel fucked up here. They screwed up the platform to the point where workstation users, the intended fucking target of this thing, are left scratching our collective heads about it. On the other hand, for desktop users, all this thing really does is drive up the power bill; it's completely absurd. This thing is just an odd chimera that, while impressive on paper, completely drops the ball in its intended market.
 
It's a "no compromise" chip. Unlike bulldozer which brings not bad multithreaded performance with single threaded performance being a joke you get CPU with tons of multithreaded power without need to sacrifice single threaded performance.

You need to look at the entire platform though. This is a true workstation chip with support for boatloads of quad-channel memory and a chipset that they... stripped all the good workstation features out of. :confused:
 
I think Intel fucked up here. They screwed up the platform to the point where workstation users, the intended fucking target of this thing, are left scratching our collective heads about it. On the other hand, for desktop users, all this thing really does is drive up the power bill; it's completely absurd. This thing is just an odd chimera that, while impressive on paper, completely drops the ball in its intended market.

Couldn't agree more. Not sure what the execs at AMD and Intel are smoking lately. Maybe they are right in thinking *most* consumers are just morons and don't have a clue what it is they are buying. Maybe there is a huge disconnect between engineering and marketing and leadership.
 
Couldn't agree more. Not sure what the execs at AMD and Intel are smoking lately. Maybe they are right in thinking *most* consumers are just morons and don't have a clue what it is they are buying. Maybe there is a huge disconnect between engineering and marketing and leadership.

I think you hit the nail on the head. Both recent releases from AMD and Intel have been managed and directed poorly by marketing and PR. It seems as though Intel, after realizing they couldn't get their expected X79 chipset out and working properly, decided to just claim it's an extreme gaming chip on a new platform on the strength of its damn near useless PCIe 3.0 lanes. So now they have a platform and CPU that isn't worth upgrading to from either their previous workstation 1366 or actual gaming 1155 platforms.
 
Couldn't agree more. Not sure what the execs at AMD and Intel are smoking lately. Maybe they are right in thinking *most* consumers are just morons and don't have a clue what it is they are buying. Maybe there is a huge disconnect between engineering and marketing and leadership.

I can see the start of the logic. Quad-channel and triple-channel memory are not really needed outside of workstation/server applications. Furthermore, creating a platform and slapping 8 or more DIMM slots on it, with the intended use of having a metric ton of memory in it, means you're shooting for ECC memory once you hit that amount; again, workstation and server applications. So it's obvious why they created different platforms, but then they dropped the ball halfway there and cut out all the cool stuff from the chipset.

I just really get the feeling they screwed the pooch somewhere along the line and released it anyway. Maybe we will get the full X79 platform we were promised, in a revision, when 8-core CPUs start to come out?

They screwed up the P67 boards something horrid on release as well.

EDIT: Most consumers, especially gamers, are morons.

Cases in point: how many people end up buying high-end ROG boards with tons of LN2 support and all sorts of features where you will not see any gains over another board unless it's on LN2, then slap an air cooler or liquid cooling on them, after buying Dominator RAM which they can never fully clock? Answer: well, most of the people who buy ROG boards! This drives up the price. The same applies to all those custom GPUs out there. Or how many people bought a 1366-based board (maybe even a ROG, Classified, whatever) only to slap one GPU, an SSD, and a single HDD in the damn thing? Answer: a TON of them.

Gamers have pretty much proven that they will pay whatever the cost as long as you slap a fancy name on it, make it look spiffy, toss in some lights, ratchet up the cost 20%, and add some features they will never use. And they'll do it time and time again. I'm pretty sure everyone here has done something just as dumb at some point; I've done it.

Why do you think they keep trotting out Extreme CPUs, ROG, Assassin, Classified, Matrix, Lightning, Fatal1ty, and such products? People do pay extra for stuff they are never going to fully use, and pay a massive premium for it, if you call it gamer or extreme.
 
Nice review.

But I bought a Rampage IV + 3930K to use 4 GPUs (4x Lightning) with my 3x30'' Eyefinity setup.

I think there will be some benefits here compared to X58. Or else, at least, I will have fun building a new system. :)
 
Nearly 4 years later, and overclocked i7-920 users still do not have anything worthwhile to upgrade to.

Pretty much this. But Intel can afford to drop the ball; AMD is still trying to find out which end zone is theirs.
 