Intel Core i3 vs. Core i5 vs. Core i7

HardOCP News

The gang over at TechSpot has posted a comparison of the Intel Core i3, Core i5, and Core i7 to give you an idea of what you get as you spend more on a CPU.

The Core i3 is intended as an entry-level option, the Core i5 is geared for mainstream usage, and the mighty Core i7 is meant for high-end systems and enthusiasts. Many will wonder which one is right for them. Do they need a Core i7 or will the Core i5 be just as fast for their needs? Should they spend less on a Core i3 and allocate the savings elsewhere, or is the Core i5 worth a premium?
 
Seeing the i3 do so well blew my tiny little mind. I knew the i7 wasn't vastly superior to the i5 in gaming, but I had no idea the i3 did so well. I would really like to see the FX lineup added to the test.
 
I'm curious about how DX12 changes the equation since it's supposed to utilize multiple cores better, up to 6 threads at least if the slides are to be trusted.

DX11 could very well be a bottleneck at this point.
 
I suspect that if you test with a third-party card such as the 980 or 290, those numbers will be a LOT more spread out, as you start to see which processors are capable of feeding the card fast enough.

But the i5 overall does much better than I expected. Probably why my 5-year-old Mac Mini still feels fast to me.
 
Guess they don't want to reveal the truth about the Pentium and Celeron lines. Yes, that $50 chip performs about the same as the $225 one in games and most non-rendering programs.
 
For most of us, this article is stating the bleeding obvious, but it's nice to see it anyway. I was disappointed they didn't go on to test the many-cored Xeons.
 
Can someone explain the performance difference in the Crysis 3 benchmark? The i5 and one of the i3s beat the i7, despite the i7 having both a clock-speed and cache advantage. I would suspect that hyper-threading is the reason for the discrepancy, but the i3s are utilizing hyper-threading as well...
 
Can someone explain the performance difference in the Crysis 3 benchmark? The i5 and one of the i3s beat the i7, despite the i7 having both a clock-speed and cache advantage. I would suspect that hyper-threading is the reason for the discrepancy, but the i3s are utilizing hyper-threading as well...

They are all within a single fps of each other, well within the margin of error for testing. I would read that chart as dead even.
 
I'm curious about how DX12 changes the equation since it's supposed to utilize multiple cores better, up to 6 threads at least if the slides are to be trusted.

DX11 could very well be a bottleneck at this point.

That's based on a popular misconception. DX12 is designed to use fewer CPU cycles for GPU management activities, meaning that games with similar CPU usage for physics and other non-graphics tasks will be less CPU-bound overall and run BETTER on low-end processors.

Yes, it does spread the work out to multiple threads as well, but it's mostly about reducing overhead. The only reason we'd need more CPU performance in games is if developers write future games that use more CPU power (and do it in a thread-aware manner).
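To put rough numbers on that, here's a toy sketch in Python (not real Direct3D code; the draw-call counts and per-call costs are invented purely to illustrate the overhead-versus-threads point):

```python
# Toy model, NOT real Direct3D: estimate CPU milliseconds per frame spent on
# game logic plus draw-call submission, with submission work optionally split
# across worker threads. All numbers are invented for illustration.

def frame_cpu_ms(draw_calls, per_call_overhead_us, game_logic_ms, threads=1):
    submission_ms = draw_calls * per_call_overhead_us / 1000.0
    return game_logic_ms + submission_ms / threads

# "DX11-like": high per-call driver overhead, effectively single-threaded submission.
dx11 = frame_cpu_ms(draw_calls=5000, per_call_overhead_us=10, game_logic_ms=8, threads=1)

# "DX12-like": far lower per-call overhead, recording spread across 4 threads.
dx12 = frame_cpu_ms(draw_calls=5000, per_call_overhead_us=2, game_logic_ms=8, threads=4)

print(f"DX11-like CPU frame time: {dx11:.1f} ms")  # 58.0 ms -> heavily CPU-bound
print(f"DX12-like CPU frame time: {dx12:.1f} ms")  # 10.5 ms -> the GPU becomes the limit
```

With these made-up numbers, cutting the per-call cost alone takes the frame from 58 ms to 18 ms of CPU time; spreading the recording across threads is the smaller additional win, which is why reduced overhead helps low-end chips the most.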
 
This older TBG i3 vs. i5 vs. i7 comparison is much better for gaming-oriented content, and it also includes the almighty G3258. I never trust TechSpot's charts.
 
Seeing the i3 do so well blew my tiny little mind. I knew the i7 wasn't vastly superior to the i5 in gaming, but I had no idea the i3 did so well. I would really like to see the FX lineup added to the test.

I have a sneaking suspicion we'd see the top-end FX trail the lowest-end i3 in every gaming test.

In the WinZip/Excel/encoding/rendering benchmarks, where multiple cores are used more effectively and the CPU can run free of GPU limitations, I think the top-end FX chips would be roughly on par with the i5.

These are just educated guesses though.
 
The only reason we'd need more CPU performance in games is if developers write future games that use more CPU power (and do it in a thread-aware manner).

Or, if gaming becomes more prevalent on the low wattage CPU cores, like in laptops. :p
 
That's based on a popular misconception. DX12 is designed to use fewer CPU cycles for GPU management activities, meaning that games with similar CPU usage for physics and other non-graphics tasks will be less CPU-bound overall and run BETTER on low-end processors.

Yes, it does spread the work out to multiple threads as well, but it's mostly about reducing overhead. The only reason we'd need more CPU performance in games is if developers write future games that use more CPU power (and do it in a thread-aware manner).

I'm sure the overhead reductions are great, but that would translate to future games finding more things to do with the saved computing power. I doubt the freed up compute muscle will just go unused by developers once DX12 establishes its foothold in the market.
 
I'm sure the overhead reductions are great, but that would translate to future games finding more things to do with the saved computing power. I doubt the freed up compute muscle will just go unused by developers once DX12 establishes its foothold in the market.

Well, it's not like they are using all the CPU resources available to them as it stands.

Very few modern games are CPU limited.
 
Soooo for gaming, i3. For multitasking (games + some apps) i5. For ego and/or encoding the i7. Goes to show that applications just haven't needed any excess horsepower for a long time now.

Take, for example, some of the Adobe tests (the ones where the i7 led). Sure, the numbers are higher (better), but outside of encoding, is it a perceptible difference?

Not worth the difference in price. When checking out processors, I always look at cpubenchmark.net; it kinda lets you ballpark their performance against older processors.
 
Pff, what next, they'll show AMD is as good as Intel?!

Would have been nice if they'd used slightly more CPU-intensive games...
 
Hasn't this been known for pretty much ever? At least since the Core iX series has been out, anyway...

That's why the recommendation for gaming rigs has always been to go for the i5 with the best video card you can afford rather than spending the extra money on the i7, unless you also do the kind of work that benefits from Hyper-Threading. Sure, the i7 is the superior processor, but when it comes to budget questions, the better value is usually the i5 plus more video card.
 
Now you know why an 8350 does so poorly today when so few applications care about multiple cores. An i3 will get you a lot of gaming for the price.
 
Soooo for gaming, i3. For multitasking (games + some apps) i5. For ego and/or encoding the i7. Goes to show that applications just haven't needed any excess horsepower for a long time now.

My opinion is that CPU innovation has somewhat stagnated because the CPU is no longer the bottleneck. The most common bottleneck in today's PCs is the mechanical hard drive. Hopefully, once I/O stops being the bottleneck, CPU innovation will pick back up, though I don't expect a night-and-day change, and I'm sure there are a multitude of other factors at play. I expect 2015-2016 to be when we start seeing widespread, mainstream adoption of SSDs.
 
They completely ignored the additional hardware features in the i7 that support hardware virtualization. If you're going to run VMs for any reason, the i7 could be a far better choice. They could have fired up a few VMs on the test systems and re-run a couple of their benchmarks inside the VMs concurrently, to show any differences there.
 
Hasn't this been known for pretty much ever? At least since the Core iX series has been out, anyway...

That's why the recommendation for gaming rigs has always been to go for the i5 with the best video card you can afford rather than spending the extra money on the i7, unless you also do the kind of work that benefits from Hyper-Threading. Sure, the i7 is the superior processor, but when it comes to budget questions, the better value is usually the i5 plus more video card.

That might be the recommendation, but every friend I have, and friends of friends I meet, continue to march on down to Frys/Microcenter and buy a range-topping i7, an AIO watercooler, a fancy motherboard......and a midrange GPU.
 
That might be the recommendation, but every friend I have, and friends of friends I meet, continue to march on down to Frys/Microcenter and buy a range-topping i7, an AIO watercooler, a fancy motherboard......and a midrange GPU.

At MicroCenter pricing, with their combo discounts, you might as well :D
 
Zarathustra[H] said:
Well, it's not like they are using all the CPU resources available to them as it stands.

Very few modern games are CPU limited.

I'm thinking it was probably a chicken-and-egg proposition, but we'll see in a few years, I guess. I'm betting the big-name game engines will change their underlying code drastically in response. I hope it means more budget allocated to better AI and positional audio.
 
I've seen an Intel i3 beat an AMD 8350 in certain games. Not many, but enough to do a double take. The point being, the i3 should not be beating AMD's top-of-the-line CPU in anything, for goodness sake. Don't get me started on the difference in power usage... blah.
 
Can someone explain the performance difference in the Crysis 3 benchmark? The i5 and one of the i3s beat the i7, despite the i7 having both a clock-speed and cache advantage. I would suspect that hyper-threading is the reason for the discrepancy, but the i3s are utilizing hyper-threading as well...

They don't go into details and I don't have Crysis 3 myself, but here are two WAGs:

1. One or more cores on the i7 are overheating, throttling the whole CPU.

2. The threads are thrashing the CPU cache on the i7 but not on the i3 or i5.
 
I've read this article but will ignore the logic and likely continue buying i7 CPUs even though I mostly game; those i3 CPUs are very nice, though. I helped my wife's boss pick out a name-brand desktop PC with an i3 and it's snappy, especially for the basic tasks they do with it. I imagine that setup should last them a long time before it begins to buckle under the weight of modern software demands.

I just can't imagine having anything less than four cores for gaming. I have Task Manager and other monitoring tools up on a secondary monitor so I can watch GPU and CPU usage, and modern games seem to love to spread out across cores.
 
I have Task Manager and other monitoring tools up on a secondary monitor so I can watch GPU and CPU usage, and modern games seem to love to spread out across cores.

This is not a very accurate way to assess multithreading capability. Threads can alternate between cores very quickly, so the load looks spread out when in reality you are seeing a one-second average: a thread that was actually pinned at 100% and then 0% in rapid succession from core to core gets reported as, for example, 35% on each core.
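A toy simulation (Python, with a made-up core count and an assumed 10 ms scheduler quantum) of how that one-second averaging hides a single saturated thread:

```python
# One thread that is 100% busy the whole time, but hops to a random core on
# every scheduler tick. Averaged over a second, every core reports modest
# load, even though total demand is exactly one fully loaded core.
import random

CORES = 4
TICKS_PER_SECOND = 100               # pretend 10 ms scheduler quanta

busy_ticks = [0] * CORES
for _ in range(TICKS_PER_SECOND):
    core = random.randrange(CORES)   # the thread lands on this core for the tick
    busy_ticks[core] += 1            # and keeps it 100% busy for that tick

for core, ticks in enumerate(busy_ticks):
    print(f"core {core}: {100 * ticks / TICKS_PER_SECOND:.0f}% average")
# Prints roughly 25% on each core: it looks "multithreaded", but it is still
# just one thread's worth of work.
```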
 
Two fast cores are always going to be better than four (or eight, as AMD counts them). I just upgraded my parents from an i7 920 to a Pentium Anniversary setup. Same hard drives, RAM, and video card. They noticed a big jump in "snappiness," as they put it.

Most people are better off with Pentiums and i3s unless they have a specific niche (gaming, rendering, VMs, etc.).
 
I'm curious about how DX12 changes the equation since it's supposed to utilize multiple cores better, up to 6 threads at least if the slides are to be trusted.

DX11 could very well be a bottleneck at this point.

This, precisely. Some games stuttered on my G3258 before I went to a 4790K.
 
Zarathustra[H] said:
I have a sneaking suspicion we'd see the top-end FX trail the lowest-end i3 in every gaming test.

This gets said a lot, and people selectively pull out synthetic benchmarks to prove the point... The reality is, AMD's CPUs have been known to perform better in a fair number of real-world situations.

AMD chips trade blows with chips costing more than twice as much (https://www.youtube.com/watch?v=eu8Sekdb-IE, TekSyndicate video), with the AMD outright smoking them in several tests and coming within a couple of percentage points of the i5s in others.

Synthetic benchmarks barely tell half the story.

I look at it this way: with CPUs having reached a point where nearly every AM3+ and FM2 socket CPU, and every i3, i5, and i7, does more than enough when coupled with a good video card (which is where the processing power is actually needed and used)... I have no problem buying an AMD CPU for half the price and putting the difference towards a better video card or doubling the RAM in a build.

I'm currently using a dual-core i3 (branded as a Celeron) in a Chromebook, and nothing I use the Chromebook for as my daily driver really lags it. My much more powerful desktop (Phenom II X6) rarely gets used; games and video production stuff only.

People, maybe out of habit, continue the pissing contest that permeates the PC enthusiast forums... but given where we are with CPU performance in general, the only real deciding factor for me now is cost. I've found that this is a growing sentiment in general.
 
This gets said a lot, and people selectively pull out synthetic benchmarks to prove the point... The reality is, AMD's CPUs have been known to perform better in a fair number of real-world situations.

AMD chips trade blows with chips costing more than twice as much (https://www.youtube.com/watch?v=eu8Sekdb-IE, TekSyndicate video), with the AMD outright smoking them in several tests and coming within a couple of percentage points of the i5s in others.

Synthetic benchmarks barely tell half the story.

I look at it this way: with CPUs having reached a point where nearly every AM3+ and FM2 socket CPU, and every i3, i5, and i7, does more than enough when coupled with a good video card (which is where the processing power is actually needed and used)... I have no problem buying an AMD CPU for half the price and putting the difference towards a better video card or doubling the RAM in a build.

I'm currently using a dual-core i3 (branded as a Celeron) in a Chromebook, and nothing I use the Chromebook for as my daily driver really lags it. My much more powerful desktop (Phenom II X6) rarely gets used; games and video production stuff only.

People, maybe out of habit, continue the pissing contest that permeates the PC enthusiast forums... but given where we are with CPU performance in general, the only real deciding factor for me now is cost. I've found that this is a growing sentiment in general.

Do you have a 2955U as well? I really kind of like this cpu. Works fantastically in the chromebook.
 
Do you have a 2955U as well? I really kind of like this cpu. Works fantastically in the chromebook.

Mine's the Celeron 847: Sandy Bridge-based, basically an i3 with a touch less cache and no Hyper-Threading. Runs great, even more so since I put 10GB of RAM in my Chromebook.
 
Ah, mine is Haswell-based. My Chromebook gets me 6-8 hours even though it's about 1-1.5 years old. Only $330-ish, and it has 200MB of cellular data free for life from T-Mobile. Can't beat having that when I go on my day-long motorcycle trips and need quick web access via cell.
 
I've seen an Intel i3 beat an AMD 8350 in certain games. Not many, but enough to do a double take. The point being, the i3 should not be beating AMD's top-of-the-line CPU in anything, for goodness sake. Don't get me started on the difference in power usage... blah.
The i7 is not always faster than the i5 but nobody bats an eye.
 
The i7 is not always faster than the i5 but nobody bats an eye.

Yeah, I mean, the only real difference between an i7 and an i5 is HT.

Some applications and games make good use of HT, others don't.

If I were building a rig with what I know today, I'm not sure I would spend the extra money on an i7 over an i5.

(And I certainly wouldn't spend the extra money on an -E variant, as I foolishly did in 2011 when I was upset over the disappointing Bulldozer launch I had waited WAY too long for before upgrading.)
 
Zarathustra[H] said:
Yeah, I mean, the only real difference between an i7 and an i5 is HT.

Hardware virtualization support is also better with the i7. It makes a huge difference running even one VM, let alone more than one. A fast i7 can run a decent game server in a VM and still let you play the game on the same computer, depending on the game and whether the VM is configured properly.
 
The i7 is not always faster than the i5 but nobody bats an eye.

I never recommend the i7 over the i5 in gaming build threads in General Hardware, and neither does Dangman. In fact, the i3 is typically the recommendation for non-hardcore gamers, and the i5 is the recommended step-up for more intense gamers. If they want something cheap that they can overclock, we point them to either the Pentium Anniversary or the GPU-less Kaveri.

Given AMD's absurd pricing until the last few months, recommending anything other than the GPU-less Kaveri has been near-impossible. Charging a premium for a feature that should be "free" (i.e., most users don't care about "higher" performance) just pisses off enthusiasts, and since OEMs aren't using the processors either, where the hell else are they going to get business?

Only OEMs with an eye for product design or for filling a niche give a shit about high integrated-GPU performance. And most of those also care about efficiency, so that niche gets filled by Intel's GT3/GT3e, or they compromise and use Nvidia's Optimus.
 
Just went through my first upgrade (aside from video cards) in years; I used to upgrade every few months just because. I went from an X58 - 12GB - Xeon W3450 - GTX 970 to a Z97 - 16GB - 4690K - GTX 970, and the performance is night and day. I'm quite impressed by the i5. I mostly play games, virtualize a TINY bit, and do my circuit designs and PCB layouts on it. It absolutely flies. I'm sure the virtualization and CAD work might go slightly faster on a 4790K than my 4690K, but not nearly enough for me to actually care. So, yeah, the i5 is nice. :D
 
Hardware virtualization support is also better with the i7. It makes a huge difference running even one VM, let alone more than one. A fast i7 can run a decent game server in a VM and still let you play the game on the same computer, depending on the game and whether the VM is configured properly.


In what way are they different in this regard? They both have VT-x, right? And all non-K CPUs support VT-d (though it might be difficult to find a motherboard that supports it).
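For what it's worth, you can check what a given chip actually exposes. Here's a minimal Linux-only sketch (assuming a standard /proc/cpuinfo layout, and that VT-d/IOMMU is enabled in firmware so the kernel populates /sys/class/iommu; otherwise, look the specific model up on Intel ARK):

```python
# Rough check for VT-x and an active IOMMU (VT-d) on Linux.
# VT-x shows up as the "vmx" CPU flag; /sys/class/iommu is only populated
# when the kernel actually brought an IOMMU up (firmware setting required).
import os

def cpu_flags():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
print("VT-x (vmx flag):", "yes" if "vmx" in flags else "no")

iommu_active = os.path.isdir("/sys/class/iommu") and bool(os.listdir("/sys/class/iommu"))
print("IOMMU (VT-d) active:", "yes" if iommu_active else "no")
```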
 