The Official DOOM 3 [H]ardware Guide thread.

The guide says the P4 3.0/ i875/ 1 gig system with a 9800 Pro will run at 1024x768 on medium quality. I really need it to run at 1280x1024 so I can run this game on my brand new 2001FP. I have my system slightly overclocked on both the CPU and vid card, so here's hoping!
 
... ick.

Avid gamer and Celeron just really don't go well together...

You'll be ~fine~ on video for low quality, but that cpu is gonna stutter like something mad, by my guess. And that ram is bad as well, but you've got an ok amount...

Good luck, buddy. Get a job.

I think we all can agree that being a gamer is not in a budget or in a computer; it's in the heart. I have a job, but all of the money goes to paying for college tuition, room, board, books, etc.

That's a Tualatin Celeron with 256KB L2 cache. It runs about as fast as a Coppermine P3 of the same speed, definitely not slow. Like all P3s, as long as the benchmark isn't FSB limited (i.e. limited by memory bandwidth), a 1.54GHz Tualatin Celeron equals a faster P4 in overall performance.

See http://www.xbitlabs.com/articles/cp...leron1a-oc.html

IOW, there's nothing wrong with a Celeron @ 1.5GHz for Doom3.

That's sort of what I was thinking too. The question is, is this game going to be limited by memory bandwidth? I think most games are, but whether it's enough to kill performance, I sure hope not.

I can go 1540MHz, 1596MHz, 1638MHz, or 1708MHz, which give PCI bus speeds of 36.6, 38, 39, and 40.6 respectively. I know the higher the better, but what's the max PCI bus speed before the hard drive gets crazy? I suspect 38 is pushing it.
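The clock options above work out consistently if you assume a 14x CPU multiplier and a fixed FSB/3 PCI divider (both inferred from the quoted numbers, not confirmed for this particular board). A quick sketch of that math:

```python
# Sketch of the clock math behind the overclock options above.
# The 14x multiplier and fixed FSB/3 PCI divider are inferred from
# the numbers quoted, not confirmed for this particular board.
def clocks(fsb_mhz, multiplier=14, pci_divider=3):
    cpu = fsb_mhz * multiplier   # core clock in MHz
    pci = fsb_mhz / pci_divider  # PCI bus clock in MHz (spec is 33.3)
    return cpu, pci

for fsb in (110, 114, 117, 122):
    cpu, pci = clocks(fsb)
    print(f"FSB {fsb} MHz -> CPU {cpu} MHz, PCI {pci:.1f} MHz")
```

At 38 MHz the PCI bus is already roughly 14% over the 33.3 MHz spec, which is why IDE controllers tend to start acting up around there.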

I am still trying to decide whether or not to buy it for these reasons. I know it would be worth the $40, which is about all I have, but if I can't play it then there is no point.
 
jamesrb said:
I think we all can agree that being a gamer is not in a budget or in a computer; it's in the heart. I have a job, but all of the money goes to paying for college tuition, room, board, books, etc.

Yeah, I know. I gamed for a long time on budget systems. I was thinking of the older celerons, that were total crap... Or the new ones. And I'm a college student as well, so I know how that goes.
 
Let me get this straight. Kyle and co. didn't use a timedemo per se but ran around a level doing approximately the same things, looking approximately in the same places and hoping the environmental (physics and ai) demands remained approximately constant during these run throughs. First, I find it a little amazing that they could do the same run through hour after hour. That would be taxing. Second, this method likely explains some of the numbers that don't quite make sense at times, especially on the systems that are video card limited. I don't believe I've ever seen a performance demo of a game taken this way before where the tester had to try and resimulate their actions. I guess close counts and the sheer number of run throughs gives everyone a pretty good idea of how those systems will generally perform.
 
So, just to clear up how these things might work...

I plan to run the game with my 6800 GT at 1280 x 1024 at high settings. In my driver control panel, I have the AA and AF "application-controlled" checkboxes OFF. I manually set them before playing a game. Usually, I leave these set at 4x AA and 8x AF. When Doom gets here, if I run the game with those settings on, but in the Doom control panel, set AA to be OFF and set to high quality, will the game be running in 0x AA/8x AF, or 4xAA/8xAF?
 
aznxk3vi17 said:
So, just to clear up how these things might work...

I plan to run the game with my 6800 GT at 1280 x 1024 at high settings. In my driver control panel, I have the AA and AF "application-controlled" checkboxes OFF. I manually set them before playing a game. Usually, I leave these set at 4x AA and 8x AF. When Doom gets here, if I run the game with those settings on, but in the Doom control panel, set AA to be OFF and set to high quality, will the game be running in 0x AA/8x AF, or 4xAA/8xAF?

NV control panel overrides any in-game settings. You can override the 8xaf/etc from there if you so desire.
 
theelviscerator said:
time will tell...methinks you better get it...
I have no doubt that you are correct. Loading times would decrease, and the game should be smoother with it. I'm just a bit hesitant about picking up another stick of KVR, since that's all I'm willing to spend on RAM right now. :(
 
Great review. I might be a little late on this because I've been out of town (traveling through Atlanta and then Alabama), so it's hard to get by a computer. :( Glad I saw this. Good job Kyle + Brent. Read every word, and I've got no complaints. :)

Can't wait to get home for some Doom III lovin' :D

I'm also looking forward to that Newegg contest with the new mobo and amd64 3800+, I would love to play D3 on that. :cool:



sorry about all the smilies, this just makes my day so I'm in a good mood
 
Congrats on the Doom3 guide, I enjoyed it.

One thing... you guys tested out a GF4 MX card, but in the minimum Doom3 specs, it lists that it requires a 64mb card with DX9 support.

As far as I know, GF4 MX's do not have DX9 support. I know because I own one (MSI GF4 MX 440 64mb), and I can't do DX9 with it.

Perhaps there's something I missed, but I thought I'd point it out to you guys.

Thx,
Kalle
 
Quick question:
The sys I'll be running Doom3 on will be a

Athlon XP Barton 2500+ at 2.2Ghz
128mb 9800pro at 9800xt core speeds (I haven't fully explored the range of the memory oc)
1 gig of Geil Golden Dragon pc3700 at 432Mhz (currently limited by a max fsb of 217 on my nf7-s)
two sata 120GB's in RAID-0 with the os and random stuff on a separate IDE 80gb. (added this to see if the striped RAID would have an impact on the texture load times)

I've pretty much concluded that I should be able to run at 1024*768 in high, but was wondering if I might be able to push it to 1280*1024 and maintain an average of above 30fps?

Again, congrats on the guide guys.
 
She just came in and handed me a prepay receipt from EB Games for DOOM III

my bday is the 4th..hehe...God how I love that woman...
 
Non-DX9 cards will do fine, as they showed with the GF4 MX440, but you won't get some of the eye candy. As they mentioned, Heat Haze is one feature that only works with a DX9 card. You'll notice they tested with GF3 cards too, and those aren't DX9 compatible either.
 
The hardware guide seems to be very well done. Although, it has completely confused me about my next hardware purchase. I was planning on upgrading my CPU, as that seems to be the current bottleneck in my system, with my 9800pro being able to perform at a higher level. However, after looking at the hardware guide, it seems my money would be better spent upgrading to a 6800 series card. I mainly compared the rig closest to my system (the 3200+) to the one I am looking at (the 3000+/3500+, planning on some oc :D ) and noticed very little performance gain with the cpu upgrade. However, upgrading the videocard seems to give it a bit of a boost. So now I'm lost as to where best to spend my $400. I wonder how demanding HL2 will be on CPU... damnit, while the guide rocked, it just made my choice harder. :mad:
 
Does anyone know why the Ultra is delayed so much, and where one can go to find out the status of these cards? Like sites that list that there will be a million shipped next week or something?

Dr Goatcabin
 
nice guide, it's nice to know that I can run 1600x1200 w/ high. Considering I have 200MHz and 40 FSB over the 3200+ system, I should be in good shape.
 
JethroXP said:
You'll notice they tested with GF3 cards too.

Maybe I'm missing it, but I looked over the review a couple times now and I'm failing to see the GF3 cards (although I do see the 8500)...

There's lots of charts...maybe I'm just missing it...
 
There is a difference between DirectX9 compatible (heck, TNTs are DirectX9 compatible) and DirectX9 hardware.
 
dr_goatcabin said:
Does anyone know why the Ultra is delayed so much, and where one can go to find out the status of these cards? Like sites that list that there will be a million shipped next week or something?

Dr Goatcabin
It has to do with not having enough GDDR3 memory.
 
GREAT GUIDE,
all meat and potatoes.

WHAT TWEAKS WORKED?

since we know the doom3 engine will be around for a while, what tweaks worked? ram latency? fsb? bandwidth? vid core? vid mem? amount or speed? etc. etc. etc.

i kept looking for a common denominator among the different systems to see what made a larger performance jump, but didn't notice any.

a64@ 2.6ghz
1 gig ram
6800gt 430/1150

should be just fine, but I'd like to have a head start on which direction I should tweak the system
 
theelviscerator said:
they clocked theirs around 800 mem..most people are running 1100ish now, they may be backtracking as D3 starts kicking in the thermal throttling.
Please do not give people the wrong data; we did not downclock anything. The card we OC'd was a 6800NU.
 
enochian said:
In the official Doom3 benchmarks John Carmack said that overclocked hardware could have trouble running Doom3 correctly. I recently purchased a BFG 6800GT OC; should I expect problems? When you benchmarked this card, did you clock it back to recommended specs?
-----------------------------
My system
AMD 64 3200
K8 pro
BFG 6800GT OC
1 Gig ram
All of BFGTech's OC cards were run in OCed mode, and they worked just fine.
 
Wow, great job on the testing, Kyle and crew - 65 hours over 3 days of in-game testing is simply incredible! I really like the fact that it's actual gameplay results and not timedemos, etc.

Can't wait to run Doom 3, and I'm happy to know I'll be able to crank it up at 1280x1024 on my system in High Quality.

Kared
 
Xrave said:
Maybe I'm missing it, but I looked over the review a couple times now and I'm failing to see the GF3 cards (although I do see the 8500)...

There's lots of charts...maybe I'm just missing it...


It's there

dr_goatcabin said:
Does anyone know why the Ultra is delayed so much, and where one can go to find out the status of these cards? Like sites that list that there will be a million shipped next week or something?

Dr Goatcabin

No GDDR3 memory.

theelviscerator said:
She just came in and handed me a prepay receipt from EB Games for DOOM III

my bday is the 4th..hehe...God how I love that woman...

Lucky.
 
Nvidiot said:
He is definitely entitled to his opinion.

Also, given the great performance he is currently getting with single cpu systems maybe there IS no justification for SMP. I only wonder if his thinking will need to change as we move forward in gaming / graphics engine design.


The way current HT technology works is not ideal for FPS gaming. It slows down the latency of the threads so that two can be processed at the same time; it has twice the pipeline length. Slow core latency won't work for all of the on-the-fly calculations that most FPS engines do. It's impossible to keep everything in sync when the core slows the processes just so it can split the thread. Quite simply, the technology needs to change before Carmack will change his view. Do some research so that you may understand better than I could explain.
 
Xrave said:
Maybe I'm missing it, but I looked over the review a couple times now and I'm failing to see the GF3 cards (although I do see the 8500)...

There's lots of charts...maybe I'm just missing it...

Sorry for the confusion, they didn't benchmark with a GF3, but they did use one for their Image Quality test. Look in the page "DOOM3 Image Quality cont." in the section titled "Value Video Card Image Quality:" You'll see that they took a screenshot with a 64MB GF3, running at 640x480 on low quality.

The point being that the GF3 is not a DX9 card, but it clearly played the game just fine.
 
CrimandEvil said:
Did you guys try BGF's water cooler 6800U with D3? That would totally rock! :D
What do we look like, total geeks that would take a frigging watercooling system to the id offices with us?
.
.
.
.
.
.
Ok, I admit, we took two actually, but they were just used to keep the noise down on our systems that were not cased.
 
Cablestein said:
Congrats on the Doom3 guide, I enjoyed it.

One thing... you guys tested out a GF4 MX card, but in the minimum Doom3 specs, it lists that it requires a 64mb card with DX9 support.

As far as I know, GF4 MX's do not have DX9 support. I know because I own one (MSI GF4 MX 440 64mb), and I can't do DX9 with it.

Perhaps there's something I missed, but I thought I'd point it out to you guys.

Thx,
Kalle
With a DX8 card such as that you will miss out on the heat haze effects. It does add a lot to the game as well.
 
aznxk3vi17 said:
So, just to clear up how these things might work...

I plan to run the game with my 6800 GT at 1280 x 1024 at high settings. In my driver control panel, I have the AA and AF "application-controlled" checkboxes OFF. I manually set them before playing a game. Usually, I leave these set at 4x AA and 8x AF. When Doom gets here, if I run the game with those settings on, but in the Doom control panel, set AA to be OFF and set to high quality, will the game be running in 0x AA/8x AF, or 4xAA/8xAF?
We did not experiment with any driver panel changes, just did not have time. That said, they are supposed to override the in-game settings, but I am unsure.
 
nice review!

any word on if Newegg will put up a system that matches the "ultimate gaming rig" in their bundle?
would be nice to see what it will cost... even if I can't afford it :)
 
will I see a difference in performance if I get a new motherboard (keeping the same proc and memory) that has 8x AGP as opposed to 4x AGP?

to be more specific, would i be better off buying an ABIT is7-E to replace my ASUS P4s533
 
morinaga said:
Let me get this straight. Kyle and co. didn't use a timedemo per se but ran around a level doing approximately the same things, looking approximately in the same places and hoping the environmental (physics and ai) demands remained approximately constant during these run throughs. First, I find it a little amazing that they could do the same run through hour after hour. That would be taxing. Second, this method likely explains some of the numbers that don't quite make sense at times, especially on the systems that are video card limited. I don't believe I've ever seen a performance demo of a game taken this way before where the tester had to try and resimulate their actions. I guess close counts and the sheer number of run throughs gives everyone a pretty good idea of how those systems will generally perform.
All written out for you already in the article under "HOW WE TESTED". Here it is again just in case.
http://www2.hardocp.com/article.html?art=NjQ0LDI

"All of our testing was done at the id Software offices in Mesquite, Texas. Over a period of three and a half days, we put in 65 hours of testing that consisted entirely of DOOM 3 gameplay. As such, our conclusions here are based on real gameplay and not on timed game demos or synthetic benchmarking tools of any kind.

We sat down with id Software’s Marty Stratton to decide on some levels that would test out the extremes of gameplay and visuals. From that discussion, we chose the “Enpro” level as the focus of our gameplay testing. Our graphs show about six minutes of real-world DOOM 3 gameplay that comprises the first half of the Enpro level. The graphs you'll see will show the game's framerate measured in frames per second as we travel through the level. FRAPS was used to collect the fps data. During our testing we are performing the actions of any would-be player. We are shooting monsters, jumping, strafing, hiding, dodging, ducking, and blowing up all sorts of stuff as we move through the level. Unlike a timed demo, this real-world gameplay stresses the CPU by using the AI, sound, and physics engines, providing a record of real gameplay system performance.

Since this is actual gameplay, the events happening are not identical in nature. Some run-throughs are a bit different than others, but we have tried our best to keep a level of consistency so that the results are comparable.

Before doing our DOOM 3 FRAPS run-through, we played through several maps experimenting with different levels of resolution, antialiasing, and texture quality for each video card. From that we have picked what we thought to be the “best playable” quality supported by the system in question.

Again, we think we've been conservative in our opinions of the playable levels of IQ shown in our guide. It is totally plausible that you might find 1280x1024 a gameable resolution where we have suggested 1024x768 as the “best playable.” We do think that most gamers will agree with our conclusions here. We've played it a bit conservative because we know that many of you will use this guide to base your future computer hardware purchasing decisions on. We certainly do not want you to use our guide and come up disappointed with your purchases. We think you have a much better chance of being surprised and getting better results than ours, but for the most part we think that we are going to be dead-on in our decisions pertaining to IQ and fps performance. "


If you have issues with how the data was collected, I highly advise you to not read any more beyond this point and never return to HardOCP again as this is the way it is going to be done here in the foreseeable future.

If you need more of an explanation of the way we do things and would like some of the details behind it, feel free to read this editorial.

http://www.hardocp.com/article.html?art=NjE2

Bottom line is that you will need an open mind to accept the way we do things around here now. That said, it is not for everyone, just the ones that want to play games.
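Incidentally, reducing a FRAPS-style per-second fps log to the min/max/average figures shown in the charts is straightforward. A minimal sketch (the two-column CSV layout here is an assumption for illustration, not FRAPS's actual file format):

```python
import csv
import io

# Hypothetical per-second fps log; FRAPS's real output format may differ.
log = io.StringIO("""\
Time,FPS
1,48
2,52
3,35
4,60
5,55
""")

fps = [int(row["FPS"]) for row in csv.DictReader(log)]
print(f"min {min(fps)}  max {max(fps)}  avg {sum(fps) / len(fps):.1f}")
```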
 
t10 said:
So when are you guys spilling the beans on 6900U SLI 2048x1536 set-up? ;p
Did not run one, but we did run our Ultimate system at 2048x1536 and it was very playable.

r_customheight 1536

r_customwidth 2048

r_mode -1

 
Unless the game is actually using all of the 4x bandwidth (compare 8x AGP to the PCI Express cards and scale down for an idea) you won't see a large increase; you might do better to invest in a new vid card. Just from my experience...
 
Micas said:
This is odd:

GPU/CPU/MEMORY.............VideoSize/Quality.........MinFPS...MaxFPS...AveFPS
6800GT/Intel 3.0/1gig...............1600x1200 HQ..............35............60...........56.5
6800GT/Intel 3.4EE/1gig..........1600x1200 HQ..............27............60...........55.3

notice the minFPS...
You need to read the How We Tested page again. Trust the graph, not the chart. A one-second spike could be the source of that number.
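To illustrate the point with made-up numbers: a single one-second dip drags the raw minimum way down even when the rest of the run is smooth, which is why the graph tells you more than the chart's MinFPS column.

```python
# Hypothetical one-second fps readings: one lone dip in an otherwise smooth run.
fps_samples = [58, 60, 57, 59, 27, 60, 58, 59]

raw_min = min(fps_samples)                  # 27: dominated by the single spike
avg = sum(fps_samples) / len(fps_samples)   # the run as a whole is fine
second_worst = sorted(fps_samples)[1]       # 57: ignoring the one bad second

print(f"raw min {raw_min}, next-worst {second_worst}, avg {avg:.2f}")
```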
 
shing said:
How customizable are the video settings?
Can we disable things like 8x anistropic or set them down to 4x or something?
Either through the menus or via the console?
You will have to test this out with the driver control panel and we did not have time to get into those variables.
 
Dosomo said:
Great guide guys! Awesome info! I do have a question though.

I know 5.1 sound is the best way to go, but i currently have a Klipsch 4.1 surround system. I was just wondering, did you guys try out a 4.1 system? Or do you know how much of a difference a 4.1 vs a 5.1 system will have in terms of sound quality? Just curious.
Not tested....
 
Dweiss said:
Unless the game is actually using all of the 4x bandwidth (compare 8x AGP to the PCI Express cards and scale down for an idea) you won't see a large increase; you might do better to invest in a new vid card. Just from my experience...

I just bought a new 6800NU, so I figured I'd ask if it was also worth upping the mobo to 8x for this game to get more performance out of the card.
 