Do any games fully utilize 6 CPU cores yet?

It looks like many games are finally starting to use 4 cores, such as Battlefield. So is getting a 6-core CPU a good way to future-proof a PC?
 
There is no such thing as future-proofing.

This. You can't "future-proof" a PC. Technology moves too fast for that, and by the time game design catches up with so-called "future-proof" hardware, it will be weak compared to what's out at the time.

To answer the question in the title: No. Most games barely utilize 2 cores right now. Don't count on games taking advantage of 6 cores until well after the majority of PC gamers have them.
 
When it comes to PCs, the best thing you can do is live in the moment. A year from now, most of your parts will be outdated. Now don't get me wrong, I didn't say useless, just outdated. I'm still on an i5 2500K with two ATI 6870s and I couldn't be happier for the time being. It's not about being ready for the future, it's about owning the "now" as hard as you can; that way the future moves just a little bit slower :)
 
I think you can somewhat future-proof certain components nowadays... the console domination has made PC innovation less frequent over the past few years... I've had my 6-core CPU for 3 years (as of March 2013)... so in a way I have future-proofed it, as games are only now slowly starting to take advantage of 4+ cores... as long as you don't use multiple monitors, the concept of future-proofing is more realistic today than it was years back when tech changed at a much more rapid pace.
 
I think you can somewhat future-proof certain components nowadays... the console domination has made PC innovation less frequent over the past few years... I've had my 6-core CPU for 3 years (as of March 2013)... so in a way I have future-proofed it, as games are only now slowly starting to take advantage of 4+ cores... as long as you don't use multiple monitors, the concept of future-proofing is more realistic today than it was years back when tech changed at a much more rapid pace.

I'm not so sure it's the consoles' fault that the market has stagnated in the past few years. I think it's more the fact that people are moving away from desktop PCs, and hardware is hitting a temporary plateau. Basically, you can do just about anything you want on a PC these days without much waiting, and until 4K or higher-resolution monitors come out and become mainstream, if they ever do, we won't really need much more graphics horsepower for the general consumer.
 
When it comes to PCs, the best thing you can do is live in the moment. A year from now, most of your parts will be outdated. Now don't get me wrong, I didn't say useless, just outdated. I'm still on an i5 2500K with two ATI 6870s and I couldn't be happier for the time being. It's not about being ready for the future, it's about owning the "now" as hard as you can; that way the future moves just a little bit slower :)

I agree, there is no reason to be on the cutting edge anymore. It seems worthwhile to wait until plenty of games actually require 6 cores before upgrading for a feature that isn't being used heavily.
 
I agree, there is no reason to be on the cutting edge anymore. It seems worthwhile to wait until plenty of games actually require 6 cores before upgrading for a feature that isn't being used heavily.

Agreed. The only reason is to have e-penis points for benchmarks. I bought top of the line years back and there's nothing that taxes it. I run Photoshop, SketchUp, 3D Studio Max, and things like that just fine.

The biggest things these days are RAM quantity and a good SSD.
 
I agree, there is no reason to be on the cutting edge anymore. It seems worthwhile to wait until plenty of games actually require 6 cores before upgrading for a feature that isn't being used heavily.

With new consoles coming up we might see game requirements increase in the next year or so.
 
I disagree with the inability to future-proof now, with two exceptions: GPU and SSD.

If you bought an i7 920 and 8 gigs of memory (in November of 2008... over 4 years ago), you would still be sitting very pretty. The exception is that you'd need to have upgraded your HDD to an SSD, or not, if you haven't been corrupted by the awesomeness that is an SSD. You'd also need to upgrade your graphics card, as the GeForce 200 series and Radeon 4000 series would struggle on some of the newer games above medium settings. Of course this all depends on your preferences and how much you initially invested.

Four cores is the new standard. As people have pointed out, the software industry has not even fully caught up to two cores. So I think buying a Sandy Bridge or Ivy Bridge, a Z68/Z77 board, 8 gigs of memory, and a quality power supply would have you sitting pretty through 2016. Haswell is coming out this year and will only bring moderate CPU gains, if any; it's all about better on-chip graphics and lower power right now. 2014's Broadwell will just be a Haswell die shrink, with more quad cores. Moore's law is proven dead.
 
Probably a lot of games could support more than 4 cores... part of the problem is that the most heavyweight tasks are already being taken care of.
 
BF3 uses 6-8 cores when playing co-op

Just because it shows activity doesn't mean it is "fully utilizing" the cores.

Has any site done a test limiting the number of cores available to modern games?

Say 2 cores vs 3 vs 4
2 threads vs 3 vs 4 vs 8
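For what it's worth, you can eyeball the difference between "shows activity" and "actually loaded" yourself by logging per-core usage while you play. A minimal sketch, assuming Python with the psutil package installed:

```python
import psutil

# Print per-core CPU utilization once per second while the game runs.
# A couple of cores pinned near 100% with the rest idling looks very
# different from every core hovering at 20-30%.
try:
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        print(" | ".join(f"core {i}: {load:5.1f}%"
                         for i, load in enumerate(per_core)))
except KeyboardInterrupt:
    pass
```

Task Manager or Resource Monitor shows the same thing per core, but logging it makes before/after comparisons easier.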
 
With new consoles coming up we might see game requirements increase in the next year or so.

I doubt it. They will utilize the hardware that's out right now; I don't think consoles are getting anything cutting-edge, hardware-wise, over what PCs already do. If anything, consoles are going to catch up to what PCs can do this generation. I think the PC end is stalling out for a while. This can be a good thing, though, as hopefully we'll get some good games with less focus on high-end graphics. Basically I'm fine with what graphics can do now. We just need more quality, innovative games, not COD 14 or BF 12 or MMO WoW clone 22. ;)
 
Just because it shows activity doesn't mean it is "fully utilizing" the cores.

Has any site done a test limiting the number of cores available to modern games?

Say 2 cores vs 3 vs 4
2 threads vs 3 vs 4 vs 8

BF3 does use all 6-8 cores for me, and I'm not stupid; I'm not talking about each one sitting at 20%.

During co-op the CPU goes from 70-90% on all 8 cores.
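If anyone wants to turn that into an actual test, one way is to pin the game to a subset of cores and compare frame rates against an unrestricted run. A rough sketch, assuming Python with the psutil package; the process name bf3.exe is a guess, not a known value:

```python
import psutil

N_CORES = 4  # re-run with 2, 3, 4, ... and note the average FPS each time

# Find the running game process by name (bf3.exe is assumed here).
game = next(p for p in psutil.process_iter()
            if p.name().lower() == "bf3.exe")

# Restrict the process to cores 0..N_CORES-1, then read the mask back to confirm.
game.cpu_affinity(list(range(N_CORES)))
print("Affinity is now:", game.cpu_affinity())
```

The same thing can be done by hand with "Set affinity" in Task Manager; the point is just to see whether the frame rate actually drops when fewer cores are available.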
 
The PC market has kind of slowed down. I've had my i5 760 for two or three years now, maybe more. BF3 is the only mainstream game I'm aware of that can use 8 cores.
 
I disagree with the inability to future-proof now, with two exceptions: GPU and SSD.

If you bought an i7 920 and 8 gigs of memory (in November of 2008... over 4 years ago), you would still be sitting very pretty. The exception is that you'd need to have upgraded your HDD to an SSD, or not, if you haven't been corrupted by the awesomeness that is an SSD. You'd also need to upgrade your graphics card, as the GeForce 200 series and Radeon 4000 series would struggle on some of the newer games above medium settings. Of course this all depends on your preferences and how much you initially invested.

Four cores is the new standard. As people have pointed out, the software industry has not even fully caught up to two cores. So I think buying a Sandy Bridge or Ivy Bridge, a Z68/Z77 board, 8 gigs of memory, and a quality power supply would have you sitting pretty through 2016. Haswell is coming out this year and will only bring moderate CPU gains, if any; it's all about better on-chip graphics and lower power right now. 2014's Broadwell will just be a Haswell die shrink, with more quad cores. Moore's law is proven dead.


This is a fair argument, but I don't think Moore's law is proven dead. I think PC developers have chosen paths other than making the next "Crysis" to benchmark your PC, and that it takes more than slick graphics to drive a sale these days. I also think Intel has a stranglehold on the CPU market, as AMD CPUs suck ass now (at least for PC gaming).
 
Regular online multiplayer alone uses 4 cores, so if you figure co-op also has to run the AI and events, and possibly host the co-op session peer-to-peer (I doubt EA uses a cloud server for 2 people), it sounds about right that it would use 6-8 cores.
 
My AMD 8120 does great in gaming because I also like to do other things, such as stream and render from 1080p to 720p on the fly to Twitch.tv. Yes, the Intel chips are somewhat faster at gaming than the AMD chips, as the extra AMD cores are left looking for work to do. But the AMD chips can do the rendering on the fly with very little FPS loss while playing the game. To get that from Intel you want to step up to their 6-core processor, which blows the AMD chip out of the water but costs over twice as much.

My buddies who tried streaming on their 2600Ks all have to turn their settings down for most titles. I can leave mine maxed out as it doesn't affect me as much. It's not a big deal, as I enjoy watching their streams just as they like watching mine. The only exception is BF3, as it does utilize all of my cores, so I have to drop a couple of settings one step below Ultra.

So I'd say that if you want to stream to your gaming clan, then investing in a 6-core Intel or 8-core AMD is a must. Unless you don't mind playing on lower settings, of course. :)
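For reference, the kind of on-the-fly encode being described looks roughly like the sketch below. It's only an illustration: it assumes ffmpeg is installed, Windows desktop capture via gdigrab, a placeholder stream key, and it skips audio entirely. The x264 encode is the part that soaks up the spare cores.

```python
import subprocess

# Placeholder ingest URL; substitute a real stream key before trying this.
TWITCH_URL = "rtmp://live.twitch.tv/app/YOUR_STREAM_KEY"

cmd = [
    "ffmpeg",
    "-f", "gdigrab", "-framerate", "30", "-i", "desktop",  # capture the 1080p desktop
    "-vf", "scale=-2:720",                                  # downscale to 720p on the fly
    "-c:v", "libx264", "-preset", "veryfast",
    "-b:v", "2500k", "-maxrate", "2500k", "-bufsize", "5000k",
    "-pix_fmt", "yuv420p",
    "-threads", "0",                                        # let x264 spread across all cores
    "-an",                                                  # no audio in this sketch
    "-f", "flv", TWITCH_URL,
]

# Runs alongside the game; on a 6- or 8-core chip the encode lands on the
# cores the game isn't saturating.
subprocess.Popen(cmd)
```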
 
So even if a game "uses" 6-8 cores, how does it run compared to four cores? I personally don't give a crap if a game uses 20 cores if it runs the same as on 2 cores. Same with the whole VRAM debate. Just because a game says it uses 3 gigs, how does it run compared to 1.5 gigs? If it is truly running out of VRAM, it would stutter like a bitch and FPS would drop drastically.
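On the VRAM side, polling the card while you play is enough to see whether usage actually hits the limit or just sits high. A minimal sketch, assuming an NVIDIA card with nvidia-smi on the PATH (AMD would need a different tool):

```python
import subprocess
import time

# Print used/total VRAM once per second; watch whether "used" ever reaches
# "total" at the moments the stutter happens.
while True:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"])
    print(out.decode().strip())
    time.sleep(1)
```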
 
So even if a game "uses" 6-8 cores, how does it run compared to four cores? I personally don't give a crap if a game uses 20 cores if it runs the same as on 2 cores. Same with the whole VRAM debate. Just because a game says it uses 3 gigs, how does it run compared to 1.5 gigs? If it is truly running out of VRAM, it would stutter like a bitch and FPS would drop drastically.

Well, considering AMD needs those cores for the CPU to be competitive with Intel, I would say it might need them just to keep up with an i5's performance in that situation, but I can't say that for sure since I have no way to compare them.
 
So even if a game "uses" 6-8 cores, how does it run compared to four cores? I personally don't give a crap if a game uses 20 cores if it runs the same as on 2 cores. Same with the whole VRAM debate. Just because a game says it uses 3 gigs, how does it run compared to 1.5 gigs? If it is truly running out of VRAM, it would stutter like a bitch and FPS would drop drastically.

Oh, let me use BF3 as an example, as it's a popular title. In single player, cores don't really matter, as what you just detailed in your statement is 100% true: the destruction that occurs is optimized well enough that any modern processor is indistinguishable from the next.

In 64-player multiplayer it does matter. Destruction physics, the unpredictability of other players, and congestion when everyone bunches up in certain areas of the map can't be optimized for in a controlled environment. If, for example, one team gets pushed back into their base, all 64 people are fighting in one confined area.

If you're on a dual-core processor, get ready for lag deaths. If you're on a Q6600, get ready to stutter to death, as your processor, even overclocked, isn't going to cut the mustard unless you put things on low and medium. On a newer Intel or AMD FX you're golden: everything is on high and it feels as seamless as playing single player. You can't really tell the difference.

I hope that helps explain why having more cores is good for a multithreaded game. For the run-of-the-mill, single-threaded stuff churned out of the software mill, it doesn't matter; grab a laptop and have fun.
 
Agreed. The only reason is to have e-penis points for benchmarks. I bought top of the line years back and there's nothing that taxes it. I run Photoshop, SketchUp, 3D Studio Max, and things like that just fine.

The biggest things these days are RAM quantity and a good SSD.

I thought Windows could only make use of up to 4 GB of RAM
 
It looks like many games are finally starting to use 4 cores, such as Battlefield. So is getting a 6-core CPU a good way to future-proof a PC?

Future-proofing is a bad reason to buy hardware. Get hardware that works now and don't worry about spending extra on bleeding-edge features that won't get used. Quad-core back in 2006 and 2007 was a waste of money; buying a dual-core yielded much higher practical performance.

It's the same with 6-core: either save the money or spend it on something that will benefit you now (a faster GPU, a big SSD, etc.).
 
I doubt it. They will utilize the hardware that's out right now; I don't think consoles are getting anything cutting-edge, hardware-wise, over what PCs already do. If anything, consoles are going to catch up to what PCs can do this generation.
Consoles not using cutting-edge hardware is irrelevant; they only need to be powerful enough. Although similar, you cannot compare PC and console architecture one to one: consoles, being programmed close to the metal on a uniform platform, will always have better optimization and be able to put out more juice. The only case where you'll see a clear cutting-edge difference is when games are developed specifically for PC, and that is less and less common for obvious economic reasons. So even if the consoles have the equivalent of two-generation-old PC hardware under the hood, I would tend to share Derangel's opinion that we should indeed be seeing an increase in gaming hardware requirements.

That being said, with the state of the economy, games costing more than ever, and studios being closed left and right... But that's another story.
 
Hardly any games use 3-4 cores; I wouldn't worry about more than 4 cores for games for many years. Certainly not before Ivy Bridge-E hits at the end of this year.
 
So would I see a significant upgrade in FPS and graphics if I went from a Core 2 Duo E6750 to an i5?
 
Depends on the game. I remember going from my E6850 to an i7 920 in Bad Company 2 and it literally doubled my framerate while using the same video card. I was so damn happy. There are a few sites out there that show CPU scaling (both frequency and cores) very well when new games come out.
 
So would I see a significant upgrade in FPS and graphics if I went from a Core 2 Duo E6750 to an i5?

Generally speaking, the games you would see the most improvement in are large-map/open-world games with semi-dynamic AI and whatnot, large-scale multiplayer, and MMOs. Also, games that employ a lot of CPU-based physics that are in effect nearly constantly, not just the occasional novelty death or flowing curtain.

All that said, just about any current quad-core will give you a boost over an older dual-core within the same brand, due to architecture improvements. But if your dual-core is still "good enough" for you, then it doesn't really matter until you hit a wall with some of the game types mentioned above.

It's also worth noting that some games are more CPU-dependent than others. I play BF3 multiplayer at 1080p, ultra settings, with a 7870 and an AMD Phenom II hex-core, at a 60 FPS average. If I disable in-game AA and use the SweetFX SMAA injector, I basically never dip below 60 FPS, as SMAA is a MUCH more efficient type of AA.

However, with Dark Souls my GPU practically idles. It's VERY CPU-dependent and seems to scale with resolution. They improved CPU performance a bit in the recent patch. Before the patch I couldn't do online PvP at 1080p, as I would get crippling lag spikes after connecting with an opponent; I would have to drop the resolution to 1280x1080. After the patch it's noticeably more consistent, but an Intel CPU would still give me a noticeable boost in Dark Souls.
 
Consoles not using cutting-edge hardware is irrelevant; they only need to be powerful enough. Although similar, you cannot compare PC and console architecture one to one: consoles, being programmed close to the metal on a uniform platform, will always have better optimization and be able to put out more juice. The only case where you'll see a clear cutting-edge difference is when games are developed specifically for PC, and that is less and less common for obvious economic reasons. So even if the consoles have the equivalent of two-generation-old PC hardware under the hood, I would tend to share Derangel's opinion that we should indeed be seeing an increase in gaming hardware requirements.

That being said, with the state of the economy, games costing more than ever, and studios being closed left and right... But that's another story.

So wait a sec: if these next-gen consoles are more like a PC, using current or last-gen PC hardware, would developers then start making games on PC and porting them over to the consoles? It would also make sense if they're competing with the Steam Box; if that takes off, more games will be built for PC hardware first, then ported over to consoles. Oh man, that would be epic!!!
 
I've been using an i7 at 4 GHz since 2010. To put it in perspective, it's been doing pretty damn well, but the video card needed updating to keep up with newer games. If you go high end, you can keep the platform for at least a few years.

If you have the $$$, get the best you can afford and are willing to pay for. Only you can determine what is a "waste" to you.
 
So wait a sec: if these next-gen consoles are more like a PC, using current or last-gen PC hardware, would developers then start making games on PC and porting them over to the consoles? It would also make sense if they're competing with the Steam Box; if that takes off, more games will be built for PC hardware first, then ported over to consoles. Oh man, that would be epic!!!

After the initial rush of games, console programmers have a tendency to program as close to the hardware as possible. They optimize to the nth degree to squeeze a few more frames out of the system, especially towards the end of its life. They can do this because every Xbox 360, PS3, and Wii is the same, not counting storage.

Every PC is different: different motherboards, video cards, operating systems, etc. So on the PC you write more generalized, generic code so that what you make will run on many configurations. Thus you rarely get to program at the hardware level.

That's why you won't see PC ports on consoles.
 
After the initial rush of games, console programmers have a tendency to program as close to the hardware as possible. They optimize to the nth degree to squeeze a few more frames out of the system, especially towards the end of its life. They can do this because every Xbox 360, PS3, and Wii is the same, not counting storage.

This really hasn't been true for quite some time. Sure, the engine developers optimize for the hardware, and continue to do so, but not nearly to the extent that older games on the NES and SNES did. Consoles are just too complicated now, and even consoles aren't entirely the same hardware, even if you discount storage (CPUs change, graphics chips change, etc.; you just focus on having compatible drivers).

I'm not saying they can't optimize to a better degree than on the PC, but for the most part, you license an engine for multiple platforms, and focus on the larger scale of the game.
 