Intel Core i7-3960X - Sandy Bridge E Processor Review @ [H]

Sorry, but that's the truth.

My "for real work" desktop makes my SLI, raided SSD, 24gb RAM "gaming desktop for friends that want to use it" look like a fucking used Hundai.

Guess it depends on what you need to do for "real work"

For most "real work" in an Engineering/Manufacturing setting a Pentium II would probably be sufficient, provided it had enough RAM. It's just a bunch of web apps, PDF's and a little CAD work. Nothing that stresses even a 15 year old system particularly hard.

I know servers still use SAS (but I don't quite understand why, as the performance and stability seem similar, and the price is higher for SAS).

That being said, I can't help but wonder how many 3930Ks and 3960Xs were ever going to be used in a server setting. At best, from a "real work" perspective, they were intended for content-creation workstations. I don't have much experience here, but IMHO SAS would be a waste on these systems.

The Xeons are the parts intended for servers, and they will likely be coupled with server motherboards, which I haven't even seen any of yet. Chances are they will have third-party SAS chips on board just to satisfy their core user group.

I guess what I don't understand is why you are raging so much over a feature that can so easily be added through third-party chips, either on the motherboard or in a PCIe slot.

It seems like such a non-issue when it's so easy to solve through third-party solutions.

Yes, for large deployments it is probably an issue as it adds some cost, but then again, X79 replaces X58, which also did not have native SAS support...
 
Thanks for the heads up on the CPU, Kyle. I guess I'll wait to see if the K series will fare better, or even Ivy Bridge. Ivy Bridge is compatible with the 2011 socket, right?
 

My understanding is that there will be an Ivy Bridge-E down the road that will be LGA 2011, but that the first releases, just like with Sandy Bridge, will be LGA 1155 parts.

There is also never any guarantee that the motherboard manufacturer will update their BIOS for future CPUs, even if they are socket compatible.
 
Zarathustra[H];1038033274 said:
There is also never any guarantee that the motherboard manufacturer will update their BIOS for future CPUs, even if they are socket compatible.
Because IB-E is supposed to be coming in around six months, you can be pretty sure there will be a BIOS update, as the same boards will still be on sale.
Besides that, buyers of these types of boards are not very forgiving; if they don't get after-sales support, they will never buy from you again and, maybe even worse, will advise all their friends to do the same.

So releasing no BIOS update would be the same as shooting themselves in the foot.
 
Zarathustra[H];1038033285 said:
Those Intel people are complete douche bags, trying to intimidate honest review sites that don't just copy and paste their marketing drivel.
It is nothing compared to what NVIDIA has been doing for years: giving cherry-picked cards for reviews, and giving no cards to reviewers that tell the truth (including Kyle in the past).

Compared to most tech firms, from what I hear, Intel is mild when it comes to strong-arming review sites.

Not that I think remarks made that way are OK; it's just a shame, but that's how the game is being played. :mad:

It's just a shame that more sites don't dare to put their foot down and tell it how it is, but instead give in to the fear of not getting review samples next time.
 
Zarathustra[H];1038020574 said:
Nice. I just decided to pick up an ASUS P9X79 WS. More money than I've EVER spent on a motherboard before (unless you count my old Shuttle SX58H7 XPC case as a motherboard.)

Just received it. Haven't opened the retail box yet, but MAN is it HEAVY compared to other mobos I've bought...

Maybe I was paying by the pound? :p
 
The answer is, I will buy it. Or rather, my company will buy it. I work for a corporation, and I don't get to overclock my machines at work. However, compared to my salary, or my hourly rate billed to a project, the cost of a $1,000 processor is small. CFD, FEA, FEM: I would love to have this kind of power. Hopefully I'll get one soon enough; this system is long in the tooth.
 
The review summary pretty much summed up what I was thinking as I was looking over each benchmark. The only people I see buying this are people who encode movies like crazy and have money to blow.

The tri/quad SLI/CF benchmarks should be interesting though.
 
Why, so you can see the same thing the IPC benchmarks showed?
 
Just to add, I see Kyle's point. Really, what would make this platform more compelling 'right now' is if we had a top-tier game like BF3 get a scalable patch that would benefit from the added memory and the extra two cores.

I believe that BF3 already supports six cores. I know that BF:BC2 did as well, as stated by DICE some time back, and by my task manager going nuts with graph spikes across all six cores on my 980X when playing either game...
 
... and by my task manager going nuts with graph spikes across all six cores on my 980X when playing either game...

Unfortunately you can't trust this.

Windows moves threads between cores many times a second, so even a game that is pinning only one core will often look like it is evenly loading all of them: it hops between cores faster than Task Manager can display, so instead you see an "average" load on each core.

This can be really confusing, because in RO2, for instance, I am CPU limited on my 4GHz Phenom II X6 even though none of my cores ever looks fully loaded.
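If you want a number you can actually trust, one low-tech check is to log the game's own CPU counter next to the machine total with typeperf. Just a sketch; the process instance name is a placeholder for whatever game you're testing:

```
:: Sample the game's CPU time and the machine total once a second, 30 samples, to a CSV.
:: "SomeGame" is a placeholder - use the game's process name without the .exe.
typeperf "\Process(SomeGame)\% Processor Time" "\Processor(_Total)\% Processor Time" -si 1 -sc 30 -o cpu_log.csv
```

On a CPU with N logical cores that per-process counter can read up to N x 100; if it hovers around 100, the game is really only doing one core's worth of work, no matter how evenly Task Manager smears it across the graphs.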
 
Zarathustra[H];1038036995 said:
Unfortunately you can't trust this.
Actually, there is a dirty trick to get around this, and that is keeping track of all the core temps.

Measure different games (and OCCT for max load) where you already know how many cores they use, use that as a baseline, and you will know whether a program is using all cores or whether Windows is just spreading the load around.

I know it's not 100% foolproof, but it's actually more accurate than most people think.
 
TL;DR: It was obvious that Intel was full of shit and that to see its power you'd have to step above gaming. It's obvious that Intel knows how to sell products to gamers with more cash than brains. Finally, it's not like they haven't done this countless times before.

You know, I think you have made your point. You don't need to be a dick about it.
 
I didn't feel like digging through 15 pages of replies about this since it's sorta trivial, but with respect to the OMG factor around the 135 W TDP I have to say "bullshit".

Yeah, 135 W would be a nasty step back if it were four cores, but it's six.

Four cores at 95 W TDP scaled to six cores is 95 W + 47.5 W = 142.5 W... so why the big OMG around 135 watts? Or isn't power consumption supposed to scale with the number of cores?
 
At stock clocks the wattage difference is roughly linear with core count versus the 2600K, but if you OC the bad boy into the 4.8GHz range the power draw grows faster than linearly with core count versus the 2600K. [H] cares about OC, and this thing is OMG not very impressive in that area.
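A rough way to see why it stops being linear (just the standard first-order CMOS dynamic-power approximation, nothing specific to SB-E):

$$P_{\text{dyn}} \approx C_{\text{eff}} \cdot V^{2} \cdot f$$

Hitting 4.8GHz generally takes extra voltage, so every core pays the V-squared penalty on top of the clock bump, and SB-E pays it on two more cores than the 2600K does.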
 
Ah! I hadn't considered the effect of overclocking. Good point, and thank you!

Yeah, I've tinkered with my Q8400 enough to watch it go non-linear at extreme OC (and watched the heatsink/fan reach saturation); it just slipped my mind.
 
Damn.

Looks like the best I can get in CrossFire with my two three-slot cards on the ASUS P9X79 WS I bought is 8x-8x :(

ASUS may make solid boards, but they have some of the worst slot layouts :(

I really hate it when some of these really important details are not available before buying and getting the damned thing home :(

[Attached image: P9X79-WS.jpg]

Slots are as follows:

1: (blue connector) PCIe 16x (or 8x if slot 4 is in use)
(blank space)
2: (black connector) PCIe 8x
3: (white connector) PCIe 4x
4: (blue connector) PCIe 16x (or 8x if slot 1 is in use)
5: (white connector) PCIe 4x
6: (black connector) PCIe 8x

So I have a $400 board on a platform with 40 lanes to the CPU and 8 lanes to the chipset, and the best I can do with my 3 slot video cards is 8x-8x...

I can't even get 16x-8x...

Meanwhile, on my current MSI 990FXA-GD80, I am running the same video cards at 16x-16x...

This is very disappointing :(

Sure, you can blame the unusual three-slot design of the video cards, but they are an ASUS design, so you'd think they would have taken them into consideration when designing their motherboards, so that at least one board can run dual three-slot video cards properly...
 
According to the Asus website this wasn't hidden:

Expansion Slots: 2 x PCIe 3.0/2.0 x16 (dual x16, triple x16/8/8, quad x8, black+blue) *1
2 x PCIe 3.0/2.0 x16 (x4 mode, white) *1

The deluxe version isn't triple at 16 either:
3 x PCIe 3.0/2.0 x16 (dual x16 or x16, x8, x8) *1
1 x PCIe 3.0/2.0 x16 (x8 mode) *1
2 x PCIe 2.0 x1

The Rampage would do it though:
4 x PCIe 3.0/2.0 x16 *1
1 x PCIe 3.0/2.0 x16 (x8 mode, gray) *1
1 x PCIe 2.0 x1

http://www.asus.com/Motherboards/Intel_Socket_2011/Rampage_IV_ExtremeBATTLEFIELD_3/#specifications
 
I think you misunderstood me.

I'm not trying to run three cards.

I'm trying to run two cards, three slots wide each.

Unless I am misunderstanding the manual (which is very possible as this manual is just as crappy as most motherboard manuals) there is no way I can get these two cards to work better than 8x-8x.

If I had cards that were two slots wide it appears the best I could do would be 16x-8x, not the 16x-16x you suggest above.

Besides, I read through the entire description on Newegg and found the product web page at ASUS, and neither had the info you posted above (which appears to be wrong according to the manual that came in the box). I even spent a good amount of time trying to find a PDF of the manual before ordering, to no avail.
 
Did you bother to click the link? That went straight to the ASUS website, so either their spec page is wrong or the owner's manual is wrong.
 
It confuses things a little that you're linking to the Rampage specs instead of this board's, but the spec page for his mobo does say "dual x16, triple x16/8/8, quad x8", yet the manual says the 16x slots turn into 8x slots when both are in use. So I guess you're right that one of the two must be wrong; I'm guessing it's the spec sheet that's wrong.
 
Hmm... I agree the manual isn't clear; thank you for this. I'd neglected to download the manual for the P8Z68-V I'm considering.
 
Yeah, I think it's specs written by marketing so that you can interpret them any way you want and then find out the hard way that you're wrong. A pretty crappy way to treat your customers, IMO.
 
It confuses things a little that you're linking to the Rampage specs instead of this board's, but the spec page for his mobo does say "dual x16, triple x16/8/8, quad x8", yet the manual says the 16x slots turn into 8x slots when both are in use. So I guess you're right that one of the two must be wrong; I'm guessing it's the spec sheet that's wrong.

That would be my guess as well, though I will be installing later today, and I'm HOPING it's the manual that's wrong :p
 
Zarathustra[H];1038038412 said:
That would be my guess as well, though I will be installing later today, and I'm HOPING it's the manual that's wrong :p
Actually, you might just be mistaken in assuming that "single 16x / dual 8x" means dual 8x when both 16x slots are in use. The 16x slots could be multiplexed with the 8x slots and not with each other, such that if you use slot 1 and slot 2, it's slot 1 that switches into its shared 8x mode.
 
Could be! *Fingers crossed*

I just wish the Taiwanese manufacturers wrote better English-language manuals.
 
Motherboards for the new 2011 socket are more interesting than the new processors themselves, IMHO.
 
I would love to buy one of the 3960Xs now, but then I read that they are already 8-core chips with two cores disabled. I think I am going to spend my winter skiing, and since the instructor season ends about March 15th, I think I will wait for the other two cores to be enabled before I spend my $1,200 (taxes included) on my X CPU. I already have it all ready to go and just need the water block and the CPU.

I have the MSI Big Bang (I was going to get the EVGA Classified, except for the problems I have been having getting one of my video cards RMA'd: three weeks and nothing until I phoned Jacob, and then it was two minutes and the RMA was done). Everything is liquid cooled: an XSPC triple-fan, triple-core radiator for the CPU, then to a booster pump, then a Thermochill dual-120mm-fan radiator, then a dual-140mm triple-core Coolgate radiator, then to the dual EVGA 580s cooled by XSPC water blocks, and then back to the triple-fan radiator, all in a single loop, and it all fits perfectly into an Antec GF-85. It's powered by a Silencer 950 Mk II certified to 115 A (1,380 W). I would have gone with the EVGA Classified motherboard if the RMA department had done their job. I also run a RevoDrive 3 X2 480GB and four 74GB Raptors. I think the MSI Big Bang may be just as good, except I love that 90-degree power connector that only EVGA has.
 
But let me say this, while Intel has been beating the drum about this being the "Ultimate Desktop Processor for Gamers," I think that is a lot of horse shit. This Sandy Bridge E is not going to do much anything for gamers if I am making the right guess based on what I have seen, possibly with one exception, and that is multi-GPU, multi-display gaming.
Well, yeah. And that's not a trivial consideration when using terms like "ultimate gaming."

I think this is entirely the reason Intel is marketing SB-E as the "ultimate gaming" part/platform: it has little to do with SB-E's per-core advantage over SB CPUs in games, but rather the jump from 16 to 40 PCIe lanes. A 2600K is a fantastic gaming CPU, and SB-E cannot do much better, unless you are talking about "ultimate gaming" rigs that run triple and quad GPUs. A pair of GTX 590s or HD 6990s on SB will only get x4 of PCIe per GPU. For "ultimate gaming" you need more PCIe bandwidth than 16 lanes total. In this light, I do not see Intel's marketing of SB-E as "horse shit."

Given that, I feel this review of the i7-3960X was far too emotional. Kyle says that EE parts aren't worth it, and I agree wholeheartedly. Yet Kyle himself is running a 990X in his machine, so I guess they are not all worthy of such a hostile response.
 
Am I missing something? A $300 or so CPU is extremely close to, and sometimes even beats, the $1,000 or so CPU.
Other thoughts:
1. Triple-channel memory was nearly worthless, so let's add another worthless channel to it.
2. 40 PCIe lanes, wow. And we could not add that to SB on 1155 because...?
3. Bulldozer was a joke; let's copy AMD with Socket 2011, but at a much higher price point.
4. Hell, I am playing Skyrim on an AMD dual-core CPU with DDR2 memory at 1080p with almost max settings.
5. Handbrake encodes my Blu-rays on my AMD 720BE and my uncle's 980X in nearly the same time, within a few minutes. A $1,000 CPU vs. a $300 CPU.
6. Hype is all hype, nothing more.
 
Am I missing something? A $300 or so CPU is extremely close to, and sometimes even beats, the $1,000 or so CPU.
Not really; S2011 is only useful for the most extreme users (tri/quad CF/SLI).


1. Triple-channel memory was nearly worthless, so let's add another worthless channel to it.
True, but S2011 is originally a server product tweaked for the extreme user, so why take it out if it's already in? All you would be doing is creating an extra type of SKU, and Intel already has too many of those, IMHO.

Besides that, it's actually the second main reason I want S2011, as eight DIMM slots make it easy to build a nice RAM disk out of 64GB of memory.

2. 40 PCIe lanes, wow. And we could not add that to SB on 1155 because...?
S2011 is basically the same as S1155 (S1155 plus twice the DIMM slots plus 24 extra PCIe lanes).

Even the X79 chipset is basically a relabeled P67, as the originally planned Patsburg X79 chipset, which had native USB 3.0 and more SATA 3/SAS ports, was not finished in time!

3. Bulldozer was a joke; let's copy AMD with Socket 2011, but at a much higher price point.
Actually, S2011 is a server product they just tweaked a bit and resold to people with too much money and/or no brains, as S2011 is only beneficial for maybe 0.05% of the market but will probably sell to something like 0.5% of it.

So what's so dumb about Intel relabeling a product and selling it for an obnoxious amount of money, mostly to people who don't know better and just want the best?

4. Hell, I am playing Skyrim on an AMD dual-core CPU with DDR2 memory at 1080p with almost max settings.
The fact that you are smart enough not to waste your money does not make S2011 a product with no customer base.

5. Handbrake encodes my Blu-rays on my AMD 720BE and my uncle's 980X in nearly the same time, within a few minutes. A $1,000 CPU vs. a $300 CPU.
That's just BS you're talking there; you must have used different settings, as I do an extreme amount of re-encoding with Handbrake, and Handbrake scales nearly perfectly with clock speed and core count.

6. Hype is all hype, nothing more.
And yeah, I agree it's mainly hype.

But I, for one, am still getting an S2011 system, as I will make real use of it.

At the end of the day, yeah, Sandy Bridge-E does nothing for 99.5% of users; only heavily threaded workloads benefit, and then only if time is money. Especially if you look at bang for buck, I agree SB-E is a no-go.

For the other sub-1%, SB-E is something to look forward to.

Actually, the quad-channel memory bus is the main reason I am even considering getting an X79 SB-E combo, or even an SR-3 Xeon combo.
I want to fill it up with reasonably cheap 8 or 12 x 8GB DDR3-1333 sticks for 64/96GB of memory, where I can use 8GB as system memory and the other 56/88GB as a RAM disk that is 10 to 20 times faster than an SSD.


I currently have a heavily water-cooled triple-5870 2GB CF system with an i7 970, and for me the 64/96GB of RAM, the extra PCIe lanes, and the extra cores for video encoding would be a plus for my next build, with triple 7970s (or, if possible, maybe GTX 680s) for a five-monitor Eyefinity setup.

But hey, building a PC like that is a hobby and has nothing to do with common sense.

And for video encoders like me, yeah, four cores aren't as good as six; it's literally a 50% speed boost.
I do a lot of video encoding with heavy post-processing, and a six-hour job would come down to a four-hour job, which IMHO is a notable difference.

And I am even considering SR-3 for dual octo-cores and 12 DIMM slots for 96GB of memory.

And yes, I know it's ridiculous, but so is having a coin collection; I just have a lot more fun building my case, and I get more use out of it than just looking at it.

It makes just as much sense to me as my Nissan R33 Skyline with 600HP; it's fucking crazy, but hell, those are the most fun things in life!

What I really would like to see is a comparison of SB and SB-E with dual/triple/quad SLI/CF
[....]

And yeah, for most of the 99.5% of users, the 2011 platform will be as useful as a Ferrari Enzo in a traffic jam!

I myself have a heavily water-cooled P6TD with an i7 970, 24GB of RAM, 3x HD 5870 2GB Matrix cards, and a 240GB RevoDrive 3 X2, and yes, I am literally using all 36 of the chipset's PCIe lanes.

Still, I am very interested in the S2011 platform as a replacement for my S1366 system, for two reasons, and yes, I am not one of the standard users.

  • I want to use the extra PCIe lanes SB-E offers for tri/quad CF (*or SLI), and upgrade my three-screen 5760x1200 Eyefinity setup to a five-screen 6000x1920 setup.
(*) An option "if" NVIDIA's Surround gaming supports a five-screen setup in their next 6xx line.
  • As for "quad channel memory (only means anything for server)": actually, it's the main reason I am interested in S2011, to be able to use 64GB of system memory to make a RAM drive (HowTo Video).

So, what's so fraking interesting about a RAM disk, you may ask?

Well, if any of you has noticed the difference between an HDD and an SSD, you will notice the same, or more, between an SSD and a RAM disk!
And yeah, 50GB/s is a hell of a lot faster than the 1.5/1.2GB/s read/write of my RevoDrive 3 X2, and still about 33% faster than an S1366 RAM disk.
Those are not real-world numbers, though: even though I have a theoretical bandwidth of 38GB/s on my PC, according to Windows Resource Monitor I get about 4~6GB/s real-world transfer rates. But that's still about 10~15x faster than the numbers I get from my 600 euro / 800 dollar 240GB PCIe SSD.

Now, with only 18GB of my 24GB of RAM available for a RAM disk, where a portion is also used for the swap file(*), I often have to swap directories from HDD to RAM and back again; with 64GB there will be no need to do that, or at least a lot less.
(*) Seriously, I don't get why Windows still needs one even if you have plenty of memory >_<

But even though it's not perfect, it's still a lot faster than any SSD; it just takes a bit of work making batch files for the programs you want to run on the RAM disk.
Once you make a proper batch file, you can use it as a template for other programs.

And Robocopy is really your friend if you want to make batch files to swap files between RAM and storage and vice versa; see the sketch below.
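A minimal sketch of what I mean; the drive letters, folder names, and program are just placeholders for whatever you actually run from the RAM disk:

```
@echo off
:: Placeholder paths: point these at your own storage folder, RAM disk letter, and program.
set SRC=D:\Games\SomeGame
set RAM=R:\SomeGame

:: Mirror the install onto the RAM disk (only copies files that changed since the last run).
robocopy "%SRC%" "%RAM%" /MIR /NFL /NDL /NJH /NJS

:: Launch from the RAM disk and wait for it to exit.
start /wait "" "%RAM%\SomeGame.exe"

:: Copy saves/settings back to permanent storage afterwards.
robocopy "%RAM%\Saves" "%SRC%\Saves" /E /NFL /NDL /NJH /NJS
```

Because /MIR only copies what changed, the second run of the same template is a lot quicker than the first.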
 
What I want is for Dan to do a review with his monitors and quad 580s/6970s and see a comparison at 7680x1600 between X79 and X58/regular Sandy.

Even on low, BF3 is choking at my new resolution. If SB-E could open it up, I would go ahead and upgrade.
 
Wouldn't you be held back at that resolution by the 1.5GB of usable memory on your cards before your 930 @ 4.2GHz would bottleneck? Honest question.
 
Hmm, why would this CPU be good for games? We are stuck with shitty console ports.
In general, what does that even mean? If anything, many console ports need all the CPU power you can give them. But yeah, this CPU will do nothing more for you than a 2600K when it comes to gaming on a single GPU.

In this review it does seem to help in some games when using a GTX 590, though. The difference over a 2700K in MW3 and F1 is massive, if their numbers are correct. http://www.hardwareheaven.com/revie...h-core-i7-3960x-processor-review-f1-2011.html
 
That's just BS you're talking there; you must have used different settings, as I do an extreme amount of re-encoding with Handbrake, and Handbrake scales nearly perfectly with clock speed and core count.

My uncle bought the 990X a few months back, so I came over with one of my Blu-ray discs and used the same settings I use on my system: quality left at 20 and DTS passthrough for the audio. The end result was that it finished less than 10 minutes faster than my system did. I was getting ready to blow a load of money on hardware to reduce my encoding time, and once I saw that I decided against it. We then tested his wife's computer, which has a 2600K, and it finished at about the same time. Not worth the upgrade. Three computers, the same Blu-ray disc, the same standard Handbrake settings, and all three results were very, very close.

Yes, the 990X was faster, but not by enough to make me want one.

The following week I drove up to his house, we put the Blu-ray on the HDDs of all three computers, and we started all three at the same time. The 990X finished first, then the 2600K, then my 720BE, all of them within a few minutes of each other. We kept the CPU monitor widget running, and the interesting thing was that the 2600K was at 800% (4 cores / 8 threads, all at 100% load), the 720BE at 300% (3 cores at 100% load each), and the 990X at 600% (6 cores / 12 threads at around 50% across all threads). Handbrake was not scaling well.

Now, my uncle bought the 990X because Facebook was not running fast enough, and the next day he put a GTX 580 in his computer just because he wanted to blow some fuses.

I am dead serious: he was sold on the 990X and the GTX 580, and all he does is Facebook. lol.

I am hoping he buys the 3960X so I can see how it does encoding Blu-rays.
 