Poor 4850 and 4850 Crossfire performance in Burnout Paradise

Kelv

I just got Burnout Paradise and started playing single player. Instantly I noticed that my PC seems handicapped. With one card, I get 40-50 fps in enclosed areas and the 20s in open parts of the city where you can see far into the distance. With two cards, I get the exact same performance.

My settings are on High. I've tried all the AA settings and they don't seem to impact performance. SSAO is off.

I recently reformatted my hard drive because of really poor performance. I thought one of the problems might have been that I had installed a second 4850, which I got back from VisionTek after I RMA'd one of mine that fried.

Here's the previous thread I wrote when I received the second card.


I don't really care about that issue anymore, unless it could be related to this problem.

I decided to benchmark the cards to make sure CrossFire was working. I didn't want to download all of 3DMark06, so I downloaded FurMark instead.

Benchmarks:
FurMark v1.6.5

CrossFire enabled (2 GPUs):
Score: 7692 o3Marks
FPS Min: 114
FPS Max: 174
FPS Avg: 128
Time: 60000 ms
Res: 1280x1024 / MSAA 0x
Render: ATI Radeon HD 4850
Active GPUs: 2
Drivers: Catalyst 9.4
CPU: E8400 @ 3.0 GHz

Single card (1 GPU):
Score: 4115 o3Marks
FPS Min: 56
FPS Max: 98
FPS Avg: 69
Time: 60000 ms
Res: 1280x1024 / MSAA 0x
Render: ATI Radeon HD 4850
Active GPUs: 1
Drivers: Catalyst 9.4
CPU: E8400 @ 3.0 GHz

These scores (7692 vs. 4115 o3Marks, roughly 1.87x scaling) definitely indicate that CrossFire is working.

I also downloaded GPU-Z to display some statistics about the cards.
Here they are:

When I installed CCC and the drivers, I just did it once; I didn't put in one card, install drivers, take it out, put in the other, and install drivers again.

Hopefully I've posted enough info for you guys to help me out; it will be much appreciated!
 
To make things simpler for troubleshooting, take out or disable CrossFire for now. That card should run the game at max settings and 8x AA; my 8800 GTS does, at 60 fps all day long. That's with no SSAO, but even with it on I get around 30 fps.
 
You mean physically take it out, right? I've already tried it with CrossFire disabled, as mentioned in my title and first paragraph.
 
You might want to consider that Burnout isn't properly supported by CrossFire with the current driver build.
 
Is Catalyst AI enabled?

Yes, it's set to Standard. Is that good or bad? I've kind of lost my knowledge on that subject, but I have heard of it before.

However, I just want to remind you guys: I tested with CrossFire off, and from the benchmarks I've seen, a single 4850 should be able to run Burnout Paradise.
 
You might want to consider that Burnout isn't properly supported by CrossFire with the current driver build.

That would explain things right there.

The OP might want to try physically removing one card, at the very least just to see if it helps.
 
By the current driver build not supporting it, you mean Catalyst 9.4's CrossFire profile?
 
I renamed the .exe to grid.exe in the game's folder (make sure file extensions are visible in the folder options first). CrossFire is working perfectly with my HD 4870 X2 now, and performance is absolutely smooth versus the stock burnout.exe.
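In case it helps anyone reproduce the trick, here's a rough sketch of that rename as a Python script. As I understand it, Catalyst matches its CrossFire profiles by executable name, so a grid.exe picks up GRID's profile. The install path and backup filename below are just placeholders I made up, not from this thread; point them at your own setup.

Code:
import shutil
from pathlib import Path

# Placeholder install directory -- adjust to wherever Burnout Paradise
# actually lives on your machine.
game_dir = Path(r"C:\Program Files\Electronic Arts\Burnout Paradise")
original = game_dir / "burnout.exe"
renamed = game_dir / "grid.exe"

# Keep a copy under the original name so the change is easy to undo.
shutil.copy2(original, game_dir / "burnout.exe.bak")
original.rename(renamed)
print(f"{original.name} -> {renamed.name}")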
 
I renamed the .exe to grid.exe in the game's folder (make sure file extensions are visible in the folder options first). CrossFire is working perfectly with my HD 4870 X2 now, and performance is absolutely smooth versus the stock burnout.exe.

That actually did seem to give me a performance boost of about 10 fps. I didn't try it without CrossFire.

I do have some very interesting information from something I just tested. The game on full settings at my max resolution of 1280x1024, with SSAO and blur, gives me only about 5 fps less than the game on all the lowest settings. I still drop into the high 20s on the lowest settings and never get above 50. Could it be my CPU not working as hard as it should?
 
At 1280x1024 your CPU is doing a lot of the work; the second 4850 is practically worthless at that resolution.
 
The E8400 shouldn't be limiting me in games. Is there any way I can test that the CPU is working correctly in games? I'm wondering whether it's staying at an idle multiplier. One thing that makes the game seem like it's performing worse than it is is micro-stutter at low fps.

Can someone explain what renaming the game's .exe to grid.exe does?

EDIT: I think I'm right. I downloaded CPU-Z and had it running alongside GPU-Z to observe performance while FurMark ran windowed. The CPU stayed at a 6.0x multiplier, i.e. only 2.0 GHz out of 3.0 GHz. Now I just need to figure out how to fix that. I know my Gigabyte motherboard drivers came with a utility that deals with this, but I didn't install it.

I did another test and noticed that during the benchmark the multiplier was flip-flopping between 6x and 9x. When it was at 9x, the fps was around 280; when it was at 6x, the fps averaged 150.
I also did some temperature tests, and I think my heatsink isn't making proper contact: it stays cool while SpeedFan reports temperatures up to 90°C running FurMark.
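If you'd rather log this than watch CPU-Z by eye, here's a rough sketch in Python (psutil is the only dependency; the one-second poll and 60-second duration are arbitrary choices of mine). On some Windows setups psutil reports the nominal clock rather than the live one, so treat CPU-Z as the authority; but if a loaded CPU sits well below its rated clock here, it's throttling.

Code:
# Quick throttling log: print CPU load and reported clock once a second
# while a game or FurMark is running. Requires: pip install psutil
import psutil

POLL_SECONDS = 1
DURATION = 60  # total logging time in seconds

for _ in range(DURATION // POLL_SECONDS):
    # cpu_percent(interval=...) blocks for the interval, so no extra sleep needed
    load = psutil.cpu_percent(interval=POLL_SECONDS)
    freq = psutil.cpu_freq()  # current/min/max in MHz, or None if unsupported
    if freq is not None:
        print(f"load {load:5.1f}%  clock {freq.current:7.1f} MHz (max {freq.max:.0f} MHz)")
    else:
        print(f"load {load:5.1f}%  clock unavailable on this platform")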

I fixed the problem. The CPU was throttling back because of temperature. I reseated the heatsink, and now I get a constant 60 fps at max settings.
 