Intel Dual-Core Waxes Intel Quad-Core (and I eat crow)

PGHammer · 2[H]4U · Joined: Oct 8, 2002 · Messages: 3,315
Mom's new Pavilion 23 showed up this afternoon, and - except for one goof by HP - is everything it appeared to be, and then some, and is making me eat crow in the process.

The goof (by HP) was including the wrong "keyboard" - the one they forgot is the keyboard that bundles the Leap Motion controller, which can turn any flat-panel LCD display into a touch-capable display.

The feathers (and I'm still picking them out of my teeth) come from Intel's Pentium G3220T. Unlike the G3220 (no T), the T is aimed at touch-screen deployments - Ultrabooks, UltraTablets (tablets based on Ultrabooks), and desktop AIOs. Like the G3220, it shares lineage with that "other" Haswell stablemate - the Pentium G3258, AKA the Pentium Anniversary Edition - the only difference being that the G3220/G3220T ship with locked multipliers.

That still didn't stop it from laying the smackdown on my Intel Core 2 Quad Q6600, which is locked down by its motherboard's G41 chipset.

One did NOT have more RAM than the other - both have 4 GB of system memory. Only type and memory SPEED differed.

Both are running 8.1 (the quad-core is running 8.1 ProWMC) - still, largely apples to apples (as the quad-core doesn't support Hyper-V; the Baby Pentium does, but it isn't installed).

Yet here I sit, spitting out feathers.

This isn't P4 losing to Q6600 - this is worse.

This is Q6600 losing - to a bottom-end (albeit Intel current-generation) dual-core.

Losing to i3 (or higher) would be easier to swallow; losing to G3220 (T or no T) is much harder to wash down.
 
Well, it is a lot faster clock for clock, has a higher stock speed, has an on-die memory controller, and is using faster RAM. Not too surprising.

You could do a quick little pad mod on your Q6600 to up it to a 1333 MHz bus and have it run at 3 GHz. It should do that fine at stock voltage, so you don't have to change anything else.

http://www.overclockers.com/intel-lga775-pad-modding/

I did this for a Q6600 in a Dell OptiPlex 780. Took all of about 5 minutes.

I also didn't make a shield/guide for the pad mod. I just used a toothpick to draw the conductive line.
 
It's all fast until you break out some benchmarks, software, and games that make use of all 4 cores (the Q6600 at stock is shit - bump that bitch up to at least 3.2 GHz, then we're talking). Not sure if the G3220 has HT.
 

Nix on HT - however, the Q6600 doesn't have that, either.
The apps that are multicore DO run better on even a Q6600; however, how many are there?
Hey - the very reason I compared the two head-to-head is that both are locked down, albeit for different reasons: the G3220T ships with a locked multiplier, while the Q6600 sits on Intel's G41 chipset, which is locked down by design (it's a consumer/corporate-stable platform - reliable as concrete, and exactly as overclockable). In other words, the Q6600's ONLY advantages were more cores and more cache (both per core and overall) - Kentsfield had both, and STILL lost to Baby Haswell.

Stock vs. overclocked is apples vs. oranges - we at [H] know that much. Other than RAM speed and HDD size (neither HDD is all that fast - in fact, they are rated identically in the speed department), the loadouts are identical. I didn't expect the old Q-ship to lose at ALL to the newer-tech dual-core. However, running everyday software - the sort of software that most folks STILL run on a daily basis - the older quad-core got curb-stomped. Not fun to watch. Not one bit. (And, as I said earlier, it's much worse when it's YOUR quad-core being curb-stomped.)

Now, think about how much worse it would be pitting even an i3, let alone an i5-K, against the same Q66xx - or any other LGA775 quad-core.
 
Is this a joke thread? I don't understand the point, the logic, or the OP's deduction from what he experienced.
 

Comparing CPUs that are literally generations apart - I am not surprised in the least. I don't think anyone who follows hardware closely is. We would have expected it.

Oh, and G3220 is not even comparable to a P4.
 

My point was an Intel dual-core stomped the same company's seminal original quad - and the dual-core doing the stomping is basically bottom-end.

This is NOT Intel vs. AMD - but Intel vs. Intel.

What it does is not JUST illustrate how much the technology has improved, but how little software - even far too much of today's software - utilizes multicore.

And I was comparing Q6600 to the P4 - not the other way around.
 

Yes, but you don't seem to understand: the Q6600 is years and years old, and this dual-core Pentium you used is much newer tech - newer instruction sets, a different chipset and motherboard, and probably some other stuff.

A more accurate assessment would be that the Q6600 platform has succumbed to the dual core pentium you were using. Performance isn't just about the CPU, it's the system as a whole.
 
I remember being floored when a 35w dual core sandy bridge went toe to toe with my Q6600 way back when the Thinkpad T420 was new.
 
This isn't the gigahertz wars anymore. Though those were fun times :D Remember when [H] was so happy when they pushed a celeron slot A 233 into 800mhz?



(attached images: a Slot 1 Celeron, and a Celeron 366 on a Slot 1 adapter card)
 

I could have told you long before this comparison that most desktop software doesn't use more than 2 threads - 1 for the GUI, 1 for background tasks is usually sufficient to do anything the average user could ever want to do.

Once you take the number of cores out of the equation, what you have left is clock speed and IPC... which, when comparing a modern-generation G3220T to a decrepit Q6600, is frankly a no-brainer.
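That "1 thread for the GUI, 1 for background tasks" split is easy to sketch. Here's a minimal, hypothetical Python stand-in for the pattern (the main thread plays the role of the GUI loop; all names are made up):

```python
import queue
import threading

# The "two threads are usually enough" pattern: the main (GUI) thread stays
# responsive while a single worker handles everything slow in the background.
tasks = queue.Queue()
results = []

def worker():
    while True:
        job = tasks.get()
        if job is None:          # sentinel: shut the worker down
            break
        job()                    # do the slow thing off the main thread
        tasks.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()

# The "GUI" queues work instead of blocking on it.
tasks.put(lambda: results.append("saved file"))
tasks.put(lambda: results.append("checked mail"))
tasks.join()                     # a real GUI would keep pumping events here
tasks.put(None)
t.join()
print(results)                   # ['saved file', 'checked mail']
```

Anything past those two threads sits idle unless the software was written to spread its work - which is exactly the point being made above.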
 
I have an E6600 in one, an E6850 in the 2nd, and an E8600 in the 3rd. All are just great.

You would think so - until newer CPUs (and their performance) smack you in the face. I thought the same, and you have three dual-cores of the same generation as the Q6600. I have simply decided to get off the barge and stop being Pharaonic (or worse, the Queen of De-Nile). Being Pharaonic, now that things are improving money-wise, does nobody - least of all me - ANY good.
 
Because it's Haswell vs. Kentsfield - and Kentsfield is even older than Penryn.

Q6600 is still 65nm and was not a very good CPU. It was way too popular on this forum compared to what it deserved. The G0 stepping did overclock well but you actually needed to push it to make buying that CPU worth it, and it was quite the space heater when that was done. Yorkfield, the 45nm C2Q, was the first quad-core that I personally recommended for general usage.

At that time I advised going with dual-core 45nm C2Ds over the Q6600, and I would still advise using a dual-core 45nm over the Q6600 today to someone looking to buy a cheap used setup. At least for most purposes.

For most consumer workloads, even today, single-thread performance is still MUCH more important than number of cores. If they were still selling dual-core CPUs with significantly higher speeds than quads, I'd likely still be recommending most people buy dual-core CPUs today. However they managed to bring clockspeeds on quads up well enough (starting with Yorkfield) that it is worth buying quad (or more) for a lot more people.

I'd take a 5 GHz dual-core CPU over a 3 GHz quad-core if the IPC were equal between the two. That comparison doesn't consider overclocking.
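That trade-off is just arithmetic. A toy sketch, assuming equal IPC (the 5 GHz and 3 GHz figures are the hypotheticals from the post, not real parts):

```python
# Toy model: with equal IPC, per-thread speed tracks clock,
# while perfectly parallel throughput tracks cores * clock.
dual = {"cores": 2, "ghz": 5.0}   # hypothetical 5 GHz dual-core
quad = {"cores": 4, "ghz": 3.0}   # hypothetical 3 GHz quad-core

def single_thread(cpu):
    return cpu["ghz"]                  # one thread sees one core's clock

def throughput(cpu):
    return cpu["cores"] * cpu["ghz"]   # embarrassingly parallel best case

print(single_thread(dual), single_thread(quad))   # 5.0 3.0 -> dual wins
print(throughput(dual), throughput(quad))         # 10.0 12.0 -> quad wins
```

Since most consumer workloads sit near the single-thread end of that range, the fast dual-core wins far more often than the raw core count suggests.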
 
Is this a joke thread? I don't understand the point, the logic, or the OP's deduction from what he experienced.
I think part of it comes from the complaints every generation that there are only "small" IPC improvements, added in with the relatively static clock speeds.

In reality there has been steady progress and comparing CPUs a few tick-tock cycles apart highlights that fact.
 
I stayed on my beat-to-hell Q6600 until I built my first Z97 4790K rig. My Q6600 spent years at 3.8 GHz on water, and would bench, fold, and Prime at 4.0 GHz. The G0 was a wonderful CPU and a great overclocker. No way I'm going to keychain that CPU.

But making this comparison at stock speeds, knowing the newer platform has better SATA support among many other things, is kind of warped.
 
Hey, I loved my Q6600. Still do. I had mine at 3.4 GHz for years, starting in 2008. Then, in 2013, I finally made the leap to the 4770K. Funny thing was, it wasn't gaming performance that made me want to do it. I wanted more speed in Lightroom.

I expected a boost, sure. But holy crap, I did not expect the next-level performance difference I got with my 4770K. 18 months later, I still think my i7 is blazing fast, and every time I have to use the Q6600, I feel like I'm in slow motion.

I'm sure the same sensation was felt even with the 'lowly' Pentium dual-core.
 
One effect multi-core CPUs have had on the industry: even though most software is still depressingly single-threaded, more and more programs have become lazy about tasks that can peg a single CPU core. It's not uncommon now to see an older dual-core computer pegged at 100% CPU because at least two background processes are each maxing out a core. Even Windows Update, or a virus scanner scanning in the background, can max out a CPU core.

I feel that quad-core CPUs, even older quad-core CPUs, still make great office systems. Any situation where the tasks being performed aren't particularly limited by the slower individual CPU-cores, but DO benefit from there almost always being an unused CPU core available, regardless of what other background tasks happen to be occurring at that time.

Over time, there will only be an increasing number of background tasks that will expect to have nearly infinite processor power at their disposal. I really would not want to be in a situation where I only have two cores, having to share those two cores between the app/game I'm actually trying to run, and all the crap running in the background.
 
Don't feel bad. My new G3258 beats the Phenom II X4 980 it replaced in most of my games at stock. Six-year-old AMD architecture in a quad can't compete with new Intel in a dual-core for gaming.
 

I don't know if that is a good comparison really, most gamers probably would have chosen an Intel dual-core over that AMD quad even when that AMD quad was brand new.
 
I think part of it comes from the complaints every generation that there are only "small" IPC improvements, added in with the relatively static clock speeds.

In reality there has been steady progress and comparing CPUs a few tick-tock cycles apart highlights that fact.

Pretty much the case - as my comparison, strange as it sounds, does illustrate.

A little improvement here and there adds up - and the entire path between Kentsfield and Haswell is full of little improvements. (Can anyone name so much as ONE major improvement along the way from Kentsfield to Haswell outside of overclockability?)

Besides, there WAS one other reason I made that particular comparison - what is the price gap between a used Q6600 and a new G3220 today?
 
(Can anyone name so much as ONE major improvement along the way from Kentsfield to Haswell outside of overclockability?)

iGPU. It's one of the major contributing factors in why you can now buy a full PC for under $300 (and in some cases, under $200). Being able to cram a GPU onto a CPU and still hand a 3-generation-older CPU its ass in benchmarks is pretty impressive.
 
A current-gen CPU using a modern chipset and DDR3 beats an 8-year-old CPU using a POS chipset (G41, really?!) and DDR2.


Is that supposed to be some sort of ground-shaking, world-changing revelation? You don't seem to know much about computers and technology, apparently.

:rolleyes:
 
OP - try something like Handbrake that will use all the cores you give it. Your quad core will probably come out ahead (although I'm just speculating.) But, everyone else is right - you shouldn't compare really old tech against new tech. It's not a fair fight.
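For anyone who wants to see that effect firsthand before firing up Handbrake, here's a rough, hypothetical sketch using Python's multiprocessing as a stand-in for an encoder-style job (the chunk counts and sizes are made up). On a quad-core, the 4-worker run should finish well ahead of the 1-worker run; on a dual-core, the gains stop at 2:

```python
import time
from multiprocessing import Pool

def crunch(n):
    # CPU-bound stand-in for one chunk of an encode job.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    chunks = [300_000] * 8  # eight independent "frames" of work
    for workers in (1, 2, 4):
        start = time.perf_counter()
        with Pool(workers) as pool:
            pool.map(crunch, chunks)   # farm chunks out across cores
        print(f"{workers} worker(s): {time.perf_counter() - start:.2f}s")
```

Real encoders split their work the same general way, which is exactly why a Q6600 can claw back a lead there despite its weak per-core speed.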
 

G41 supported DDR3 as well. We have been slowly retiring them at work in favor of inexpensive i5 machines, but so far all of those old G41 chipset systems had DDR3 and they all still work fine. It may not have been the enthusiast's chipset, but OP was right in stating it was a corporate stable platform.

It is too bad that the majority of software, especially business/productivity software, is not written with multi-threaded architectures in mind. The new i5s definitely move faster than the Core 2 Duos, but they are almost never utilizing more than the first core. I always feel like there's a lot of potential going to waste with these machines.
 
Something is obviously different between my builds and those of the people recommending dual cores for everyday work.

When I use my G3258, it bogs down until I get a game up and running - *then* it's relatively smooth sailing.

But if I try to do any kind of actual computing, I can really feel the lack of cores. And by actual computing I mean checking email, surfing the web, incidental background tasks, and a game idling in the background - stuff I would consider a normal afternoon at home.

If I try to use a VM, it's painfully slow.
 
Q6600 cpubenchmark.net score 2990
G3220T score ~ 2793.

Which means the per-core performance is about double your Q6600's (roughly 2793 across two cores vs. 2990 across four).

IMHO the i3-4370 @ MC with a free (or almost-free) motherboard should be a great upgrade for my wife's Q8200.

Q8200 = 2835
i3-4370 = 5438

Basically twice as fast.
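To sanity-check the "about double per core" claim, the arithmetic is just the overall score divided by the core count (scores as quoted above, treated as ballpark figures):

```python
# Rough per-core comparison using the overall scores quoted above.
chips = {
    "Q6600":   {"score": 2990, "cores": 4},
    "G3220T":  {"score": 2793, "cores": 2},
    "Q8200":   {"score": 2835, "cores": 4},
    "i3-4370": {"score": 5438, "cores": 2},  # 2 cores + Hyper-Threading
}

for name, c in chips.items():
    per_core = c["score"] / c["cores"]
    print(f"{name}: ~{per_core:.0f} per core")

# The G3220T lands near 1400 per core vs. the Q6600's ~750 - close to
# double - and the i3-4370's overall score is roughly twice the Q8200's.
```

Dividing a whole-chip score evenly by core count is crude (it ignores HT and turbo), but it's good enough to show why the old quads lose every lightly-threaded race.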
 
I recently upgraded my C2D machine to a Q6600 a few months ago, for shits. Great CPU, IMO. It lacks single-threaded performance, but the extra cores really help keep it afloat these days for being, what, 8 years old? My C2D is now running Windows XP with a 5:4 monitor and playing classic games like a champ. I love Core 2. In some ways I wish I woulda jumped to Nehalem, though; those i7s aren't much newer than Core 2 and STILL perform beautifully.
 
Awww, this thread is getting me all misty!
My Q6600 G0-stepping beast is about to be retired to an HTPC/XBMC setup.
I've not had a single problem with it for almost a decade... damn, that was definitely worth the money!
Now it's tax season and time to upgrade, most likely to an i7-5820K (everyone tells me the speed boost is gonna make me pee my pants!)
Still, even running as my HTPC/guest computer, I bet the Q6600 will still run "everyday" shit until it melts down on its 100th birthday (best guess for timeline).
 
This isn't the gigahertz wars anymore. Though those were fun times :D Remember when [H] was so happy when they pushed a celeron slot A 233 into 800mhz?

Celeron 533s usually went to 800. Before that, people were overclocking 300As to 450 and 366s to 550. I don't believe 233 -> 800 was ever a common thing.

And AMD was Slot A :p

UPDATED: Had a typo
 
Celeron 533s usually went to 800. Before that, people were overclocking 300As to 350 and 366s to 550. I don't believe 233 -> 800 was ever a common thing.

And AMD was Slot A :p

Pretty sure the 300MHz Celeron was commonly OCed to 450MHz, not 350. (Maybe a typo?)
 