Lost Planet Benchmark: What's your numbers?

AuxNuke · Gawd · Joined: Nov 11, 2001 · Messages: 605
Just wanted to start a small database to compare and contrast some framerates on this demo using the built in benchmark.
Demo download: http://www.nzone.com/object/nzone_lostplanet_downloads.html
Along with your hardware and current clocks, be sure to specify which OS, graphics driver version, and Lost Planet version (DX9 or DX10) you're using. Thanks!

CPU: E6600 @ 3.4
GPU: eVGA 8800GTX (625/2000)
OS/DX: XP Pro SP2, DX9
GPU Driver: 158.19

1) 1920x1200, 4xMSAA / 16xAF, Max In Game Settings
Snow: 34
Cave: 51

2) 1600x1200, 4xMSAA / 16xAF, Max In Game Settings
Snow: 40
Cave: 59
 
I wasn't trying to be rude, please don't take it that way. One post was in the ATi subforum, the other is for DX10 numbers. Just trying to give you some numbers to chew on.
 
No worries! I didn't see a post about it on the first few pages of the main video card section and didn't check the "General gaming" section. Must be the old age getting to me ;)
 
I still think it's relatively useful to have a post here for the general video card forum viewers who're interested. The thread for the front-page news is mostly a preview of a review Brent Justice will be doing.

So, in that spirit...

[benchmark screenshot: lostplanetbm2av4.jpg]


CPU: E6600 @ 3.2
GPU: eVGA 8800GTS (640/2000)
OS/DX: Vista Ultimate x64, Dx10
GPU Driver: 158.43

1) 1280x960, 4xMSAA / 16xAF, Mostly High In Game Settings (HDR, Shadow Quality, and Shadow Resolution @ Medium)
Average: 38
Snow: 38
Cave: 50


I did notice something rather odd, though: EIST activates during outdoor scenes, lowering my CPU clock (and subsequently my CPU utilization), which in turn seems to drag my Snow average, and my overall average, lower. Now, I know DX10 is supposed to offload a good deal of the processing to the GPU and only hit the CPU when necessary, but this seems to involve more than just that. It's strange, and I'd be shocked if this doesn't change with future releases of this game or other titles. I've considered that it might correlate with the graphics card drivers, but I'm not sold on the idea. Odd, if nothing else.
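If anyone wants to confirm the downclocking, here's a rough sketch (nothing game-specific, just a generic check) that polls the OS-reported CPU clock while the benchmark loops. It assumes Python with the psutil package installed, and NOMINAL_MHZ is a placeholder you'd set for your own rig:

import time
import psutil

NOMINAL_MHZ = 3200  # placeholder: set this to your rig's nominal clock

while True:
    freq = psutil.cpu_freq()  # OS-reported current/min/max clock, in MHz
    if freq is not None:
        flag = "  <-- downclocked (EIST?)" if freq.current < NOMINAL_MHZ * 0.9 else ""
        print(time.strftime("%H:%M:%S"), "%.0f MHz%s" % (freq.current, flag))
    time.sleep(1.0)

Run it in a second window during the Snow section and watch for the clock dipping below nominal.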

Sadly, I can't seem to get the correct max resolution to work on my 21" BenQ LCD; it's just not available. On the first run-through, the performance test and the actual gameplay demo both worked on my 24" monitor @ 1920x1200; after that first run, it would start to load and then consistently end up on a black screen (not a full desktop blackout). That's more than likely a driver bug.
 

You can edit the config file under /user profile/local settings/capcom/LP/config.ini and manually change the resolution.

I did this and it seemed to work in-game, but I think the stress test reverted back to the default resolution... (a rough scripted version of the edit is sketched below)
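For anyone scripting it rather than editing by hand, here's a minimal sketch. The path follows the post above, but the ResolutionX/ResolutionY key names are placeholders; open config.ini first and check what Capcom actually calls them before running this:

import os
import re

# Path from the post above, expanded for the current user.
cfg = os.path.expandvars(r"%USERPROFILE%\Local Settings\capcom\LP\config.ini")

with open(cfg, "r") as f:
    text = f.read()

# Placeholder key names; adjust the patterns to the file's real entries.
text = re.sub(r"(?m)^ResolutionX=.*$", "ResolutionX=1920", text)
text = re.sub(r"(?m)^ResolutionY=.*$", "ResolutionY=1200", text)

with open(cfg, "w") as f:
    f.write(text)
print("Updated", cfg)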
 

Word. :cool:

You 'da man. I looked, and all I found was the .cfg file in the root, so I wasn't sure how they were storing the config options.
 

No prob...

BTW, the LP demo crushed my comp; I only got ~18fps at 1300x1000 :(
Win XP Pro SP2
AMD 4200
X1800GTO
 
Sorry to hear it, brothaman. But besides a few demos (the Adriana and Cascades demos do look much better in person than in screenshots; they just don't do either justice), you really aren't missing out on much.

As long as your gear is playing the current games well, you've still got time!
 
Snow: 41, Cave: 42

I left everything on default, but I noticed that AA defaulted to 4x. Is this the way it's supposed to be, or is it because I typically force games to 4x AA through the NVIDIA CP by default?

Numbers are at 3.2GHz, no OC on the video card.
 
AA defaults to 4x, yes. By the way, if anyone can get CrossFire working on this, could you post how? I'm persistently seeing a drop versus a single X1900XT.
 
QX6700 @ 3GHz, 8800GTX (160.03 drivers), running 1920x1200 with 4xAA/16xAF and everything maxed except medium HDR and medium shadows.

4 Cores
39 Snow 55 Cave

2 Cores
38 Snow 55 Cave

1 Core
38 Snow 35 Cave

* Results taken on WinXP Pro running DX9 Codepath
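For anyone who wants to repeat this core-scaling test without BIOS changes, one way is to pin the game's process to a subset of cores via CPU affinity. A rough sketch, assuming Python with psutil installed; the process name below is a placeholder (check Task Manager for the demo's real executable name):

import sys
import psutil

TARGET = "LostPlanetDX9.exe"  # placeholder process name; check Task Manager
cores = [int(c) for c in sys.argv[1:]] or [0]  # e.g. "0 1" pins to two cores

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == TARGET.lower():
        proc.cpu_affinity(cores)  # restrict the process to these cores
        print("Pinned pid %d to cores %s" % (proc.pid, cores))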
 
Cos you haven't switched on "FPS view" in the options :p

Oh, there's a 1st-person option then? I was just reading about it on their site and it said 3rd person, which scared me, 'cause 3rd-person shooters, in my opinion, suck.
 
I may be out of place with my lowly PC, but I think it will give a real-world score of what most people can expect.

Intel Celeron D @ 3.46GHz
MSI GeForce 8600 GTS OC
1280 x XXX
Snow: 23, Cave: 11

I didn't notice any anomalies or anything; it just ran without problems. Maybe not going to be the highest score, but adequate.
 

Thanks for the perspective from the owner of an older system, but when you say adequate...

Adequate for what?
 

Adequate for movie watchin' ;) Those are some sucky scores.

CPU: AMD X2 4200+ @ 2.64ghz
GPU: 2x EVGA 7800GT in SLI (480/1200)
OS/DX: XP PRO 32bit, SP2, DX9c Aug07
GPU Driver: 93.71
RES: 1280x960

Turned off AA/AF and enabled Multi-GPU; everything else default.

Snow: 35
Cave: 39
 
IMO, the game runs for shit; there's way too much snow blowing around outdoors, and the HDR is too strong... it's like a beacon shining from the sun in my room at night... but not a big deal...

...and the game runs slow as shit considering the graphics aren't that special to begin with... my numbers are in line with everyone under Vista with an 8800GTS and the 160 drivers... but the framerate still sucks.
 
I left the game at all defaults (it picked 1280x720 for my 16:9 display; I wasn't even allowed to change to any higher resolutions; other games work fine, however).

E6600 @3.3GHz
8800GTX (Stock)

DX10: Snow: 66, Cave: 63
DX9: Snow: 79, Cave: 64

Both DX9 and DX10 were tested using Vista.

My Rig is:

Intel D975XBX "Bad Axe" Motherboard (Intel 975X-ICH7R) (BIOS 1463)
Intel Conroe E6600 Stepping 6 Rev B2 @3.3GHz
GIGABYTE G-Power Pro Cooler GH-PDU21-MF (Heatsink/Fan)
Seasonic USA S12 Energy Plus 650W Quad Rail PSU (Active PFC)
Corsair TWIN2X2048-6400C4 2GB (2*1GB) DDR2 PC-6400 (@2.1V)(4-4-4-12)
eVGA 768MB e-GeForce 8800 GTX GDDR3 PCI-E
Emprex HD-3701 37" LCD Display - High Resolution (1920x1080x32)
Matshita DVD-RAM RW SW-9585
Western Digital WD360GD SATA 36GB (Drive C)
Western Digital WD1200JD SATA 111GB (Drive D)
3*Maxtor SATA 149GB RAID0 (Drive E)
Intel 82573E/82573L Gigabit Ethernet (Driver 9.5.12.0)
Creative Sound Blaster X-Fi XtremeMusic (Driver 5.12.6.1187)
Antec Performance TX640B Mini-Tower Case
 
FPS view displays your framerate in the top left corner lol.

Does it? That's handy; I'm so used to console commands for that I didn't think it might do that. So it's stuck in 3rd-person mode then?
 
DX9c / Windows Pro, 160.03 drivers. All tests: 4xAA / 16xAF, everything high except shadows (medium).

960x600
Snow: 125
Cave: 78

1280x800
Snow: 90
Cave: 78

1920x1200
Snow: 47
Cave: 75

2560x1600
Snow: 28
Cave: 51

Tuniq Tower 120
EVGA 680i SLI
QX6700 @ 3466MHz (13 x 266MHz)
4 x 1024MB RAM, CL 5-5-5-5 T2 @ stock
2 x XFX 8800 GTX 768MB @ stock
X-Fi Fatal1ty Pro
WD Raptor 36 GB
WD RaptorX 150 GB
Dell 3007WFP-HC
 
Quite frankly, everyone should just plain ignore this game for benchmarking; it's built on an engine that was written for the Xbox 360 and will be plagued with performance issues unless it's gone through a 100% complete re-write (and I doubt that).

You can tell it's plagued with console stench: its menus refer to "PC settings" as opposed to just "settings", it's got console instructions left, right, and centre (press A to do this or B to do the other), and you can't even invert the mouse unless you go into the console controls.

Any reviewer that's not screaming "stay away from this" in the context of benchmarking really isn't doing their job very well; console ports always run like ass on the PC compared to the visuals they deliver, for example Splinter Cell: Double Agent.

On my rig:
BFG 8800GTX @ 600/1800
E6600 @ 3.0Ghz
4Gb RAM
Vista 64bit (DX10 demo)

Default settings (1280x800 with all default)
70 FPS for snow
42 FPS for cave

On highest settings (1280x800 with 8xAA and 16xAF, everything max)
21 FPS for snow
20 FPS for cave

I had to set most things to medium to get a decent, playable frame rate of 40-50 FPS throughout, and to be honest it barely looks any different from max settings. It's just a fuck-load of blowing snow which engulfs everything and blocks your vision, so you're shooting at vague silhouettes. The game is very *yawn* anyhow; I've seen a lot of it on the Xbox 360 and wasn't really impressed at all.

Bring on Crysis and some fine tuned DX10 benchmarking.
 
31 FPS in Snow and 34 in Cave.

Everything default with 16x aniso @ 1280x800.

When I choose the 1280x800 resolution and check my LCD, it says it's running at 1440x900...

Must be a bug in the game...

I played the demo fine without any noticeable slowdowns. Pretty fluid @ 30fps since it's 3rd-person.
 
Thanks for the perspective from the owner of an older system, but when you say adequate...

Adequate for what?

When I began, I was on an integrated ATI Radeon Xpress 1100 graphics/chipset with 1024MB of DDR2-4200 memory; I upgraded to 2 gigs of Corsair XMS DDR @ 667, and it could barely play NFS: Carbon without bogging down to a stop.

The only other PC I have to base my opinion on is a P4 @ 2.4GHz with 1MB cache, 2 gigs of DDR-400, and an ATI Radeon 9800 Pro 128MB that's been getting long in the tooth for a while.

But thanks for the input; it did make me think about it for a minute.
 