Why must the i5 be so expensive?

There's something wrong with your rig then. I have an i7-860/560ti 448 and average around 70fps with everything maxed in Crysis 2. BF3 multiplayer I play with everything turned down, because I only care about frame rates.

Also, I've seen *plenty* of full-tilt 1.9 Crysis 2 videos on YouTube based around i5-K (either 2500K or 3570K) and single-GPU setups as plebeian as the HD6770 (the rebadged HD5770 that was replaced by the GCN-based, and cheaper, HD7750/HD7770, which also have their share of C2 show-offage on YouTube).

One other way Crysis 2 and BF3 differ (especially in terms of their YouTube videos) is that BF3 videos typically showcase multiplayer (which can introduce all sorts of outside-the-PC lag issues into the mix), while Crysis 2 videos showcase the campaign - thus, Crysis 2 videos are far easier to compare, even across dissimilar rigs, than those for BF3 (or MW3, for that matter).
 
Also, I've seen *plenty* of full-tilt 1.9 Crysis 2 videos on YouTube based around i5-K (either 2500K or 3570K) and single-GPU setups as plebeian as the HD6770 (the rebadged HD5770 that was replaced by the GCN-based, and cheaper, HD7750/HD7770, which also have their share of C2 show-offage on YouTube).

The HD6770 isn't an HD5770 rebadge. It's a slower card. Nvidia is the one that rebadges almost every generation.
 
Exactly: you have a 7950, DejaWiz has a 570.

Look at BF3 charts sometime ;) A 7970 limits its performance, NOT the CPU, if we're talking quads, which I believe we are here!

SO there is NO reason to blow 200-300 bucks on a CPU if you aren't willing to go SLI/CFX straight up and purchase something above and beyond a 7970's performance for gaming initially. It's all wasted cash.

So you guys keep on preaching to go for a 200-300 dollar CPU and then get a GTX 480/560/570/580 or less in performance, far weaker than even a 7970. It's simply useless when people drop that advice here, and I see it every day from your crowd. Very annoying advice. Completely useless as well.

Or here's another one: buy a 200-300 dollar CPU and don't get an SSD, lawl.

I'll buy another CPU when it's *GASP* actually *useful and a bargain* - when I have a GPU that is beastlier than an OC'd GTX 680. Not to brag about an unrealistic benchmark score that delivers ZERO real-world or NOTICEABLE performance gain because I slapped a weak/cheap GPU next to a beastly/expensive CPU, a combination that makes no noticeable difference in my games. You know it's a waste to do that, so why do it, guys?

Some of us use computers for more than jerking off in BF3. If you really think your 560ti and Athlon x4 are competing with high end Intel and 7970s you're out of your mind. This isn't even an argument.
 
i5-3570k from microcenter for $180, for the performance I think I got a steal!
 
i5-3570k from microcenter for $180, for the performance I think I got a steal!

That thing is a 2500K on a natural test booster (not quite steroid level :)). OC it to 4.5GHz with good volts and you will be a happy camper with 97 or 98% of the performance of a 3770K! Good temps at the lower volts too; just don't go past 1.25V and you will be fine.
 
Who is TIM? If you meant the thermal material, who cares? It works well enough to OC nicely.

Well, it's because the volts/power usage are so amazingly low... but then the temperatures are higher than 1366 after 1.25 volts, or maybe it was 1.35. My max temps were like 70 with 1.375 on my 1366... 3770Ks can get into the 90s with less than that :eek:
 
Fuck TIM, but other than that, Win! good deal.

It's already been proven that the temperatures have nothing to do with the TIM under the heatspreader. People have taken off the heatspreader and connected their heatsinks and waterblocks directly to the cores and temperatures are still much higher than sandy bridge.

It's a problem with the 3D tri-gate transistors, how they are arranged, and how closely they are packed together. There is not enough surface area or room for the heat to dissipate. Fluxless solder would only help by a couple degrees at best, but they run much, much more than a couple degrees hotter.
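As a rough back-of-the-envelope way to picture that surface-area argument, here's a minimal sketch comparing heat per square millimetre of die. The die areas and TDPs are approximate, commonly cited figures assumed for illustration, not numbers taken from this thread:

```python
# Rough heat-flux comparison between quad-core Sandy Bridge and Ivy Bridge dies.
# Die areas and TDPs are approximate, commonly cited figures (assumptions for
# illustration only).
chips = {
    "Sandy Bridge 4C (i7-2600K)": {"die_mm2": 216.0, "tdp_w": 95.0},
    "Ivy Bridge 4C (i7-3770K)": {"die_mm2": 160.0, "tdp_w": 77.0},
}

for name, spec in chips.items():
    flux = spec["tdp_w"] / spec["die_mm2"]  # watts per mm^2 of die area
    print(f"{name}: {flux:.2f} W/mm^2")

# Roughly 0.44 vs 0.48 W/mm^2 at stock; overclocking pushes power well past
# TDP, so the smaller 22nm die concentrates even more heat into each mm^2
# that has to pass through the TIM and IHS.
```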
 
It's already been proven that the temperatures have nothing to do with the TIM under the heatspreader. People have taken off the heatspreader and connected their heatsinks and waterblocks directly to the cores and temperatures are still much higher than sandy bridge.

It's a problem with the 3D tri-gate transistors, how they are arranged, and how closely they are packed together. There is not enough surface area or room for the heat to dissipate. Fluxless solder would only help by a couple degrees at best, but they run much, much more than a couple degrees hotter.

People have also replaced the TIM, put the heat spreader back on, and seen significant improvement. Since when does one (or even two) people's experiments with direct-die contact constitute definitive proof?
 
People have also replaced the TIM, put the heat spreader back on, and seen significant improvement. Since when does one (or even two) people's experiments with direct-die contact constitute definitive proof?

Logically it makes sense. If the spreader with stock material does not hurt temps compared to direct contact then it is not the material causing anything. One can however conclude that with a spreader and good material, you get better temps. You can't say that the stock material hurt them in this case.
 
Running a Core i3-2100 here. Blows my old Phenom II X4 955 BE out of the water, no questions asked.

I was an AMD hardcore fan, but now it's hard to justify buying it... and I don't think it's expensive, even considering I live in a third world shithole where we pay 3x more for anything, that's the best computer I've ever built.
 
Logically it makes sense. If the spreader with stock material does not hurt temps compared to direct contact then it is not the material causing anything. One can however conclude that with a spreader and good material, you get better temps. You can't say that the stock material hurt them in this case.

Seeing that there is, at the very least, just as much evidence to the contrary, you can't say the stock material didn't hurt them either. The heat sinks are designed to work with the IHS intact. Removing it means you have to make modifications just to achieve proper contact with the die. I'd say the guys who put the IHS back on have far more validity than the one or two people who went direct die, considering that you just added other variables that could affect the results.
 
I filled up the inside of the IHS with oil and sealed it up.

With the fluid convection going on inside as the die heats up, I'm effectively increasing the thermal dissipation efficiency of the IHS. Try it.
 
Running a Core i3-2100 here. Blows my old Phenom II X4 955 BE out of the water, no questions asked.

I was an AMD hardcore fan, but now it's hard to justify buying it... and I don't think it's expensive, even considering I live in a third world shithole where we pay 3x more for anything, that's the best computer I've ever built.

So seeing as how you pay three times as much as we do where you live, you paid $357 for your i3? Good grief! At those prices I'd still be on Socket 754.
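(If it helps make the arithmetic explicit: working the "3x" comment backwards from $357 implies a roughly $119 US price for the i3-2100, which is an assumption here, not a figure stated in the thread.)

```python
# Working the "pay 3x more" comment backwards; the US price is an assumption.
assumed_us_price = 119   # rough US street price for an i3-2100 at the time
multiplier = 3           # "we pay 3x more for anything"
print(assumed_us_price * multiplier)  # -> 357
```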
 
It's already been proven that the temperatures have nothing to do with the TIM under the heatspreader. People have taken off the heatspreader and connected their heatsinks and waterblocks directly to the cores and temperatures are still much higher than sandy bridge.

It's a problem with the 3D tri-gate transistors, how they are arranged, and how closely they are packed together. There is not enough surface area or room for the heat to dissipate. Fluxless solder would only help by a couple degrees at best, but they run much, much more than a couple degrees hotter.
Yes. If Intel would put the inactive transistors between cores, that would help thermal dissipation greatly. Currently, one half of the die is dedicated to the processing cores, and the other half to the on-die GPU.
 
Exactly: you have a 7950, DejaWiz has a 570.

Look at BF3 charts sometime ;) A 7970 limits its performance, NOT the CPU, if we're talking quads, which I believe we are here!

SO there is NO reason to blow 200-300 bucks on a CPU if you aren't willing to go SLI/CFX straight up and purchase something above and beyond a 7970's performance for gaming initially. It's all wasted cash.

So you guys keep on preaching to go for a 200-300 dollar CPU and then get a GTX 480/560/570/580 or less in performance, far weaker than even a 7970. It's simply useless when people drop that advice here, and I see it every day from your crowd. Very annoying advice. Completely useless as well.

Or here's another one: buy a 200-300 dollar CPU and don't get an SSD, lawl.

I'll buy another CPU when it's *GASP* actually *useful and a bargain* - when I have a GPU that is beastlier than an OC'd GTX 680. Not to brag about an unrealistic benchmark score that delivers ZERO real-world or NOTICEABLE performance gain because I slapped a weak/cheap GPU next to a beastly/expensive CPU, a combination that makes no noticeable difference in my games. You know it's a waste to do that, so why do it, guys?

Teletran, I love your posts and usually get a lot out of them, just because it always seems you look at things from a completely different angle. I'm fairly confident I understand the point you are making, but it seems you are focused primarily on games and games alone. I wanted to add that, comparing just the two systems in my sig, I can tell the i7 is faster/better. It's pretty much plain as day, and that was before I added the SSD and overclocked it. Also, I just installed an SSD on my mother's PC, and she's running a Phenom II 955 quad-core CPU, not overclocked. Even with a fresh install of Windows 7, I still notice my i7 is just a little bit quicker.

My point is that outside of gaming there is a difference. Even with casual use, as minor as it might be, I notice it. That's not to say I couldn't easily live with my 720 or the 955. It's just that, depending on the application, these higher-priced CPUs that benchmark better than AMD's counterparts are indeed faster.

The reason I originally went with the i7 (I say "originally" because I built my 930 machine before my 720) was longevity. I wanted something that would last, and two years ago Socket AM3 looked to be on its way out, while 1366 was still very strong. Had I known what I know now, there's a very good chance I would have gone with AMD then. My i7 system may be faster than my AMD system, but that AMD system would have EASILY gotten me by... Oh, and did I mention it was ~$1,000 less to build? The 930 cost me ~$290 while the 720 was ~$80.

I love both my machines and am very happy with their respective performance. I'm also glad I decided to build an AMD machine just so I could compare them side by side. If my Phenom II were paired with a 5850 and an SSD and were overclocked to 4.0GHz as well, who knows how much closer the real-world speeds would be?
 
Running a Core i3-2100 here. Blows my old Phenom II X4 955 BE out of the water, no questions asked.

I was an AMD hardcore fan, but now it's hard to justify buying it... and I don't think it's expensive, even considering I live in a third world shithole where we pay 3x more for anything, that's the best computer I've ever built.

Ya know, it's funny. My OC'd i7 paired with the SSD and some other higher-end parts is faster than my 720, but by no means do I think it "blows my Phenom II out of the water". In encoding and multithreaded apps, yeah, sure - but there I would expect as much.
 
Exactly: you have a 7950, DejaWiz has a 570.

Look at BF3 charts sometime ;) A 7970 limits its performance, NOT the CPU, if we're talking quads, which I believe we are here!

SO there is NO reason to blow 200-300 bucks on a CPU if you aren't willing to go SLI/CFX straight up and purchase something above and beyond a 7970's performance for gaming initially. It's all wasted cash.

So you guys keep on preaching to go for a 200-300 dollar CPU and then get a GTX 480/560/570/580 or less in performance, far weaker than even a 7970. It's simply useless when people drop that advice here, and I see it every day from your crowd. Very annoying advice. Completely useless as well.

Or here's another one: buy a 200-300 dollar CPU and don't get an SSD, lawl.

I'll buy another CPU when it's *GASP* actually *useful and a bargain* - when I have a GPU that is beastlier than an OC'd GTX 680. Not to brag about an unrealistic benchmark score that delivers ZERO real-world or NOTICEABLE performance gain because I slapped a weak/cheap GPU next to a beastly/expensive CPU, a combination that makes no noticeable difference in my games. You know it's a waste to do that, so why do it, guys?

The beauty of dumping extra cash into the CPU is that the investment can be maximized as long as the software being used will take advantage of it. A better GPU can always be added at a later time, typically with much less hassle than switching out the CPU and, in many cases, the MoBo as well.

I agree with your logic that it's typically more important to spend more money on the best GPU possible considering today's games, but that's only for a gaming-only PC. And BF3 is but one example of a game that can be very GPU-bound. It's not the only game out there that everyone plays, and it's not the only activity that people do with their computers. Hell, I've never played it and probably won't for at least a couple/few more years, until it's on sale for $10 or less. Anyway, I think it's okay to recommend a weaker processor based on the example of one PC game if that one PC game is all that is played... but what about other games and software? It's foolish to see all the CPU and GPU performance data right in front of you from a myriad of sources and keep going back to BF3 as the justification for recommending an inferior processor.

As for getting a higher-end CPU and needing to go SLI/CFX, why? I have a single 24" 1080p monitor that I'm perfectly happy and content with. I don't want or need a multi-monitor setup that would make a current-gen SLI or CFX configuration worth the cost. What's worth it to me is getting the most CPU now, since I typically don't upgrade my processor and MoBo for 5-6 years, and skipping every other GPU generation. My 570 carried over from my old system and, as tempted as I am to get a 670 today, I'm going to wait for the next NV and AMD offerings, because I'm not gaming 100% of the time and I'm not on my home computer 100% of my waking hours. I'm one of those computer users who does game a lot, but I also like to tinker around with non-gaming activities. I'm not the only one.
 
I filled up the inside of the IHS with oil and sealed it up.

With the fluid convection going on inside as the die heats up, I'm effectively increasing the thermal dissipation efficiency of the IHS. Try it.

Can't... tell if serious... or trolling... :confused:
 
The HD6770 isn't an HD5770 rebadge. It's a slower card. Nvidia is the one that rebadges almost every generation.

Actually, there is one (and only one) difference between the HD57xx and HD67xx - the BIOS. The GPUs *themselves* are otherwise identical. The BIOS change was made to implement a feature that even the HD57xx hardware supported - HD3D over HDMI (supported since HD5xxx; even the HD5450 supports it) - and to correct a bug in several HD57xx cards (most notably XFX's, though XFX was not alone).

AMD did, in fact, get whacked in the press (and here at [H]) for following Nvidia into the rebadging game - however, these two GPUs were the *only* carryovers/rebadges that AMD did with HD6xxx; the rest were new products from the ground up.

However, neither is a CPU, which was the point of the thread.
 
Get a 2500k and be set for at least another 2.5 - 3 yrs. They're not THAT expensive dude...
 
It's worth someone trying, if they could ensure no air, and actually seal it properly. ;)

I remember Tom's Hardware a few years ago filled an acrylic case with oil and used it as a cooler. It did work :)
On the other hand, [H] in one of their thermal compound tests used cheese... it wasn't working that well, though :)
 
I think it's because the AMD CPUs that are out right now are a disappointment and Intel basically has a "monopoly" with their offerings. So Intel can happily set whatever prices they want, knowing people will buy their product due to the disappointing AMD CPUs. I checked the price of my i7 2600K and even after ONE year the price hasn't changed one $$$; it's exactly the same. But the price of my memory dropped $20 (which is so depressing) -_-
 
The i5 isn't that far off from the top-tier processors in most applications (I assume); $200 is a good price for a pretty good processor.
 

Call me cynical...

... But having a phase-changing liquid in a sealed cap (against the PCB) sounds like a terribly bad idea. I know... it's not the original intention of your post.

I don't think the oil will get hot enough to do this, or even expand enough to break the seal... but still. Also, the last I read, oil (at least the mineral oil that people like to dunk their PCs in) is a terrible TIM.
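For what it's worth, a quick conduction estimate makes the "oil is a terrible TIM" point concrete. The thermal conductivities, layer thickness, die area, and power below are ballpark values assumed purely for illustration:

```python
# Thermal resistance of a thin interface layer: R = t / (k * A).
# All numbers are assumed ballpark values for illustration only.
thickness_m = 50e-6      # ~50 micron bond line between die and IHS
area_m2 = 160e-6         # ~160 mm^2 of die area, expressed in m^2
power_w = 100.0          # rough heat load in watts

conductivity_w_per_mk = {     # typical thermal conductivity in W/(m*K)
    "mineral oil": 0.13,
    "decent thermal paste": 5.0,
}

for name, k in conductivity_w_per_mk.items():
    resistance = thickness_m / (k * area_m2)   # kelvin per watt across the layer
    delta_t = power_w * resistance             # temperature rise across the layer
    print(f"{name}: {resistance:.3f} K/W -> ~{delta_t:.0f} C rise at {power_w:.0f} W")

# A stagnant oil layer conducts heat far worse than paste at the same
# thickness, which is the usual reason mineral oil gets called a poor TIM
# (oil-submersion builds rely on bulk convection, not on oil as a thin TIM).
```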
 
The vast majority of heat pipes for low temperature applications use some combination of ammonia (213–373 K), alcohol (methanol (283–403 K) or ethanol (273–403 K)) or water (303–473 K) as working fluid.
- from Wikipedia.

That's what I thought was in them. I don't know where the idea of oil being inside heatpipes came from lol.
 
Ammonia and alcohols are soluble in water, being polar and all. Stop digging. :p

P.S. The i5 is not expensive. It's just that Intel has raised expectations by lowering price/performance tiers.
 