How long before Dual-Core gaming?

What do you mean by dual-core gaming?
As in games optimized for dual-core procs?
If so, then I guess 1 year for most current games out there and any future titles.
 
2-3 years perhaps for complete adoption, but I wouldn't be surprised if the Unreal 3 engine has some support.
 
{NG}Fidel said:
What do you mean by dual-core gaming?
As in games optimized for dual-core procs?
If so, then I guess 1 year for most current games out there and any future titles.

Ohh man, either you know something that I don't or you're smoking some strange stuff...

(Please say you're an insider and that the games are being programmed as we type :p)

It took 4 years to develop D3 and a couple of years to develop HL2, and even MOO3 took a couple of years. So from where I'm standing it's 2-3 years minimum, probably closer to 4-5, unless Intel releases some miracle dual-core tool that breaks single-threaded code up into 2 threads.
 
well i'm pretty sure unreal3 has been in development for a while now, and supports multithreading. how well it takes advantage of dual core? i don't know, but it's a start :p
 
Nobody ever really explored multiprocessing in games before, except for experiments like Quake3... but apparently that wasn't very successful, because Doom3 abandoned it again.

Futuremark is coming out with a multiprocessing benchmark...
We'll have to see what they've split up... and see how much it affects games...
Until someone comes up with something good, it may just be more trouble than it's worth to try and support multiprocessing.
 
(cf)Eclipse said:
well i'm pretty sure unreal3 has been in development for a while now, and supports multithreading. how well it takes advantage of dual core? i don't know, but it's a start :p

Man, you're the [H]ardness Supreme around here. I mean I just can't post anywhere without you bouncing in and correcting me :D

LoL

Keep correcting man!
 
badasspenguin said:
I did an interview with 2 of the developers on the upcoming Morrowind sequel, Oblivion, and they stated that Oblivion would support dual-core processors and would take every advantage it could of them. You can check out the full Q&A here: http://www.evilavatar.com/forums/showthread.php?t=1356

Right now I believe we'll see that game this coming fall.

I'm behind a firewall/blocker. Could you summarize it or just rip the relevant parts and post them here please? :)
 
I think the problem with multiple threads in games is keeping everything synchronised, which would be a complete nightmare with a first-person shooter, but with other genres it could bring more to the party. For example, in a Morrowind-style RPG and in RTSes, path-finding and AI can be passed off to the currently unloaded CPU while the graphics are being organised by the other.
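Roughly the shape of what I mean, as a hand-wavy sketch (modern C++ threads, and names like computePaths/renderFrame are invented, not from any real engine):

[code]
#include <functional>
#include <thread>
#include <utility>

struct World { int frame = 0; /* units, map, computed paths, etc. */ };

void computePaths(World& w) { ++w.frame; }     // pretend: expensive AI/path-finding
void renderFrame(const World& w) { (void)w; }  // pretend: submit the draw calls

void runFrame(World& current, World& next)
{
    // Kick off AI/path-finding for the NEXT frame on the other core...
    std::thread ai(computePaths, std::ref(next));

    // ...while this core draws the frame the AI finished last time.
    renderFrame(current);

    ai.join();                 // both halves must be done before moving on
    std::swap(current, next);  // the freshly computed state becomes current
}

int main()
{
    World a, b;
    for (int i = 0; i < 3; ++i) runFrame(a, b);
}
[/code]

The synchronisation headache is that join(): the frame can never finish faster than the slower of the two halves, so it only pays off when the AI work is genuinely heavy.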

With something like networking, however, you don't want any changes going on while you package up all the info on current player and bullet placements and facings, which makes multi-CPU FPSes "pause" the game. The only things that could be processed at the same time are things that don't get transmitted (for example the special effect of an explosion going off or the smashing of a crate: "crate smashed" is sent across the network and all the clients decide what it looks like from each player's point of view). These pauses aren't apparent to the player, since they last less than a frame and prediction eliminates their effect entirely, but multi-CPU doesn't mix too well with multiplayer and won't bring a massive benefit for a while.
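And the "pause" for networking is really just copying the transmittable state out while nothing is mutating it; something like this (again, invented names, only to show the shape):

[code]
#include <cstdint>
#include <vector>

struct EntityState { std::uint32_t id; float pos[3]; float facing; };
struct Snapshot    { std::vector<EntityState> entities; };

// Called at a point where the simulation isn't changing anything: the copy
// itself is the whole "pause", well under a frame for typical entity counts.
Snapshot takeSnapshot(const std::vector<EntityState>& live)
{
    return Snapshot{ live };
}

int main()
{
    // Only transmittable facts go in; cosmetic stuff (explosion effects,
    // crate debris) is reconstructed locally by each client.
    std::vector<EntityState> live = { { 1, { 0.f, 0.f, 0.f }, 90.f } };
    Snapshot s = takeSnapshot(live);
    (void)s;  // hand s to the network code to serialise and send
}
[/code]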
 
Frallan said:
I'm behind a firewall/blocker. Could you summarize it or just rip the relevant parts and post them here please? :)

Sure, here's the section from the interview:
bapenguin: Will there be seamless loading between zones? Does it still load when entering a "dungeon"?
Jashin: Yes, that's a very important, and often overlooked part of the gameplay experience.
kathode: We will still be going with loads between interiors and exteriors. This is because it allows us to dramatically increase the detail of our environments in general. You don't have to worry about keeping around or loading on the fly a lot of art for the interiors while you're walking around the exterior world. For exteriors we are doing everything we can to keep loading as transient as possible, including aggressive implementation of multithreaded code. I can't promise a completely seamless experience, but we are doing everything we can to maintain very high fidelity environments and keep loading down to a minimum.
Jashin: Fair enough.
bapenguin: Any plans for dual processor support then? Considering the upcoming dual core processors from Intel and AMD this year, which could be very beneficial.
Jashin: As you can see, bap’s all about tech.
kathode: Oblivion will definitely benefit from a multi-processor setup.
bapenguin: That rocks hardcore!
 
That rocks hardcore!
Agreed. Kinda reminds me of Halo: once the level had loaded, loading each section as you ran into it took like half a second. Beats a lot of other games I play where it's like "NEVER!! YOU MUST WAIT!"
 
I remember a game called Hawkeye on my C64 that ran off tape. When you got to the end of a level the Hawkeye bloke would run along into a power charger type thing and get zapped while the game totalled up your score and bonus points. While it was doing this it was loading in the next level. Almost 2 decades on we still don't have seamless! :D

Seriously though, the whole point of DMA (Direct Memory Access) is so that the CPU doesn't have to get involved. Games should be seamless by now. Maybe with another CPU it will be possible, but the "Loading..." screen isn't going away anytime soon, I feel.

I've been watching STALKER for a while too - the constant slippage is agonising, but it looks like it will be worth the wait.
 
ColinR said:
I remember a game called Hawkeye on my C64 that ran off tape. When you got to the end of a level the Hawkeye bloke would run along into a power charger type thing and get zapped while the game totalled up your score and bonus points. While it was doing this it was loading in the next level. Almost 2 decades on we still don't have seamless! :D

Yea, Hawkeye was fantastic... real codeporno. The music was incredible too, by the way; I believe Jeroen Tel made it (Dutch guy \o/). Still worth listening to today, via SIDPlay or such :)
On the Amiga there were also some very cool games... One of them was called SWAT, if I recall correctly... It was a vertically scrolling shoot-em-up... The beauty was in the levels... They were HUGE, and incredibly detailed... yet the game only required an Amiga with 512 KB... The clever bastards made the levels load from disk in the background while you were playing. No waiting at all, and perfectly smooth of course.
Then again, having a fixed hardware platform, and working without any kind of OS underneath that tries to multitask in the background, has its advantages.
Halo was mentioned above... typical console game... The beauty is that most of the console-like 'no-wait' feel is preserved on PC.
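That background-loading trick still looks the same today, just with threads instead of custom disk loaders. A rough sketch (file name and types made up purely for illustration):

[code]
#include <atomic>
#include <fstream>
#include <iterator>
#include <string>
#include <thread>
#include <vector>

struct LevelChunk { std::vector<char> data; };

std::atomic<bool> nextChunkReady{false};
LevelChunk nextChunk;

// Runs on a spare core: stream the next level chunk off disk while the
// main thread keeps playing the current one.
void streamChunk(const std::string& path)
{
    std::ifstream f(path, std::ios::binary);
    nextChunk.data.assign(std::istreambuf_iterator<char>(f),
                          std::istreambuf_iterator<char>());
    nextChunkReady = true;   // the main thread checks this flag once per frame
}

int main()
{
    std::thread loader(streamChunk, "level2.chunk");  // hypothetical file
    // ...keep simulating/rendering the current level here; when the player
    // reaches the boundary and nextChunkReady is true, swap the chunk in.
    loader.join();
}
[/code]

The catch is the one mentioned above: on a fixed platform with no OS competing for the disk, it was much easier to guarantee the data arrived in time.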
 
ColinR said:
With something like networking, however, you don't want any changes going on while you package up all the info on current player and bullet placements and facings, which makes multi-CPU FPSes "pause" the game. The only things that could be processed at the same time are things that don't get transmitted (for example the special effect of an explosion going off or the smashing of a crate: "crate smashed" is sent across the network and all the clients decide what it looks like from each player's point of view). These pauses aren't apparent to the player, since they last less than a frame and prediction eliminates their effect entirely, but multi-CPU doesn't mix too well with multiplayer and won't bring a massive benefit for a while.


Networking is already done in a separate thread. Real-time online games wouldn't be playable if it weren't. If you sat around waiting for the server to gather up all the client data and then update all the clients with the new information, each time before you tried to do anything that would change the game state (like gathering user input, pathfinding, AI if any, collision detection, etc.), you'd probably manage from 1 to maybe 4 or 5 FPS.
Moving it to a separate core won't be a big deal. (It won't help a whole lot either, as netcode is a pretty small chunk of your CPU cycles.)
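In sketch form that separation usually looks something like this (names invented, no real socket code): the game thread drops updates into a queue and a dedicated network thread drains it, so the simulation never blocks on the wire.

[code]
#include <condition_variable>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

std::queue<std::string> outbox;   // packets waiting to be sent
std::mutex m;
std::condition_variable cv;
bool quitting = false;

void networkThread()
{
    std::unique_lock<std::mutex> lock(m);
    while (!quitting || !outbox.empty()) {
        cv.wait(lock, [] { return quitting || !outbox.empty(); });
        while (!outbox.empty()) {
            std::string packet = std::move(outbox.front());
            outbox.pop();
            lock.unlock();
            // a real send(socket, packet...) would go here; it may block freely
            lock.lock();
        }
    }
}

void queueUpdate(std::string packet)   // called from the game thread
{
    { std::lock_guard<std::mutex> lock(m); outbox.push(std::move(packet)); }
    cv.notify_one();
}

int main()
{
    std::thread net(networkThread);
    queueUpdate("player 7 pos 1 2 3");   // the game thread carries on immediately
    { std::lock_guard<std::mutex> lock(m); quitting = true; }
    cv.notify_one();
    net.join();
}
[/code]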
 
As I understand it, the biggest benefits will be for AI and physics. In one of Anand's articles, Tim Sweeney of Epic said something to the effect that trying to split rendering between 2 procs was not only a nightmare for coders, but the end result would be no better than running on a single proc.
 
I'm excited about high-speed dual-core chips, b/c if Velocity is already selling OC'd 3.6, 3.8 & 4.0GHz machines, that means there's a lot of headroom in these chips.

Probably b/c of the large die/surface space (like the MO / Gallatin chips).

I'm thinking I'll be able to buy an 830, or 820 (non-EE), chip and OC to 3.6GHz minimum... and probably 4GHz.
 
I've made simple game apps before and from those experiences I thought games HAVE to be multithreaded...that is to say, there needs to be at least one thread for the automated parts of the game, e.g. the animation, and another thread for user input. As I understand it, if there were only one thread, the CPU would keep running the automated components and the user would not be able to interact with the game. I have no idea how large game engines work so I wonder if this idea still applies.
 
dualityim said:
I've made simple game apps before and from those experiences I thought games HAVE to be multithreaded...that is to say, there needs to be at least one thread for the automated parts of the game, e.g. the animation, and another thread for user input.

Depends on how you do it... I prefer to have the input queued in the background, and then handle the input once per frame (no need to do it more often anyway, because you won't be able to give any visual feedback faster than that).
The advantage is that you don't need an extra thread or extra synchronization... which reduces overhead... back in the old days (486/early Pentium era), it actually made realtime things run smoother too... even something as simple as a video player.
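In sketch form (no real engine or OS API, just the shape of it):

[code]
#include <queue>

enum class Key { Left, Right, Fire };

std::queue<Key> pendingInput;   // filled by the OS message pump / input callback

void onKeyPressed(Key k) { pendingInput.push(k); }   // events queue up as they arrive

void handleInputOncePerFrame()
{
    while (!pendingInput.empty()) {   // drain everything queued since last frame
        Key k = pendingInput.front();
        pendingInput.pop();
        (void)k;  // apply k to the game state here (move player, fire, ...)
    }
}

int main()
{
    onKeyPressed(Key::Left);          // pretend these arrived during the frame
    onKeyPressed(Key::Fire);
    handleInputOncePerFrame();        // then update the game state and render the frame
}
[/code]

No extra thread, no locks in the game logic itself; the queue just decouples when the input arrives from when it's processed.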
 