Vid card under $200 for Q6600 & Q9550?

$160 video card upgrade for Q6600 system...
~ R9-280-3gb

The Tomb Raider (2013) built-in benchmark is used here...
~ Far Cry 3 does not have a built-in tool.

Here's a quick comparison of existing conditions:
~ I did not get all formal with driver versions, etc.

Q6600 + 8gb RAM + HD-4870-512mb :eek:
@ Normal = 46 fps
@ High = 31 fps
@ Custom = 31 fps
~no other options were offered

Q9550 + 8gb RAM + HD-6870-1gb :)
@ Normal = 89 fps
@ High = 59 fps
@ Ultra = 39 fps
@ Ultimate = 25 fps
@ Custom = 14 fps

EDIT:

Here's my NEW condition: :eek: :D :p ;) :)

Q6600 + 8gb RAM + R9-280-3gb
@ Normal = 160 fps
@ High = 117 fps
@ Ultra = 84 fps
@ Ultimate = 56 fps
@ Custom = 29 fps


Screenshots:
https://docs.google.com/document/d/1Din1kE11JVSR29FdIkH60jgw4TMGXJnOE67HihTYGfc/pub

MMMMMMWWWUUUUAHAHAHAHAHAHAHAHAHAAAAA!!!!!!

:D :) :D :) :D :) :D :) :D :) :D :) :D :) :D :) :D :) :D :)


Ran my system against yours to get a baseline before I switch out the 5850 for the R9 280.

Q9550 + 4gb RAM + HD-5850-1gb :)
@ Normal = 67.1 fps
@ High = 45.2 fps
@ Ultra = 27.1 fps
@ Ultimate = 17.7 fps

I knew the 6870 was a slight upgrade over the 5850 (a 10-20% fps increase at most) in most games, but our difference looks closer to 35%. My Q9550 and 5850 are both running stock clocks. Are yours overclocked? Or do you think my 4GB of memory is holding me back?
 
Q6600 stock = 2.4GHz
Q9550 stock = 2.8GHz
I'm overclocked to 3.2GHz on both CPUs, & all vid cards are @ stock speeds

If the vid card memory is fully used, I think the game will also use system RAM, but I am not sure.
It's been a while since I was really reading up on that kinda stuff...

Either way, the extra 4GB in daily use is totally worth it.
When I went from 4GB to 8GB, I don't remember noticing it in gaming as much as in daily use.
 
Overclock the Q9550 to 4.0GHz? Shouldn't be very hard with a Gigabyte P45T motherboard; you might even be able to hit 4.2, which should give you 30% more performance.
 
I ended up putting in a bid for 4x2GB memory on eBay. Hopefully that helps make my system a little less sluggish.

Maybe the combination of you having double my memory, a ~15% higher CPU speed, and a 6870 versus my 5850 is the reason why mine seems to run so slow.

Do you know if my Gigabyte DS3 is a poor overclocker? I previously had an E6420 with this motherboard and was able to get only 2.8GHz stable. When I upgraded to the Q9550, I was not able to overclock *at all*. I guess there's a chance it could be the type of memory I have. After playing with the settings for a few hours, I decided to just live with the stock 2.83GHz clock.
 
Installed the R9 280 tonight. It's gigantic! Had to move a hard drive and re-route several cables to get it to fit.

Tomb Raider
1080p
Normal 162.2
High 118.7
Ultra 85.1
Ultimate 56.9

1440p
Normal 100.3
High 73.9
Ultra 52.9
Ultimate 36.4

Dragon Age: Inquisition

1080p
Medium = 56.8 avg / 42.6 min
High = 33.3 avg / 17.7 min
Ultra = 35.1 avg / 29.4 min

Firestrike
5725

Notice how similar our 1080p numbers are now. It's as if the CPU didn't make a difference.

Super impressed that a sub-$200 card can make my eight-year-old motherboard run like new! Despite the manufacturer's 750W PSU requirement, my 520W seems to work just fine so far (knock on wood) once I got the 6-pin-to-8-pin cable.
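Side note: the 1440p drop tracks the extra pixels being pushed pretty closely. A minimal sanity check, assuming the "1440p" runs above are 2560x1440:

```python
# Does the 1080p -> 1440p fps drop track the pixel count?
# Assumes "1440p" above means 2560x1440.
px_1080p = 1920 * 1080   # 2,073,600 pixels per frame
px_1440p = 2560 * 1440   # 3,686,400 pixels per frame

pixel_ratio = px_1440p / px_1080p   # ~1.78x more pixels
fps_ratio = 162.2 / 100.3           # observed drop on Normal, ~1.62x

print(f"pixel ratio:        {pixel_ratio:.2f}x")
print(f"fps ratio (Normal): {fps_ratio:.2f}x")
```

The fps drop (~1.62x) comes in a bit under the pixel ratio (~1.78x), which is typical: not all per-frame work scales with resolution.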
 
Good going. People always cry CPU bottleneck before realizing that simply adding more GPU is cheaper than adding a GPU and a CPU (new mobo, etc.). Some people have hard budgets to upgrade a system at a certain time. At 1080p and above, you really are more dependent on GPU scaling anyway.
 
That's because the CPU makes little to no difference in Tomb Raider 2013 (especially in the benchmark). You can run the game on two cores flawlessly. In fact, my chip runs underclocked between 1600MHz and 2400MHz during the benchmark. I've tested lots of times with and without Hyper-Threading and with 2 and 4 cores: no difference, just like this pic shows.

If you guys really want to test CPU + GPU power, you need to test another game. Crysis 3 would be a good starting point (and I think the best one). Hitman: Absolution is also good, as is Far Cry 3 (or the more recent Far Cry 4). GRID 2 and Just Cause 2 are also very good for CPU scalability testing, as all of those react very well to clock speed and core count. Just as examples.

To add a bit more info, I ran a quick Tomb Raider test with my card (280X) at stock clocks to be more comparable:

Average fps:
Normal = 195.5
High = 136.7
Ultra = 101.0
Ultimate = 67.6

Similar numbers, corresponding well enough to the jump from a 280 to a 280X. It's a well-coded game as far as the CPU goes. =)
 
Despite the manufacturer's 750W PSU requirement, my 520W seems to work just fine so far (knock on wood) once I got the 6-pin-to-8-pin cable.

I'd be keeping a close eye on those voltage rails with your favorite choice of app, and possibly even breaking out the multimeter just to verify. As long as the rails are in spec, you'll be OK for a while. :)
 
Sorry, no multimeter. :(

And like I said, I'm probably just going to live on the edge and build a new system if this breaks my computer.

What's the worst that can happen, by the way? PSU blows up? Motherboard fries? GPU craps out? A fire?
 
All of the above... but less chance with a good name-brand one, of course. Now, if you're not even the least bit curious whether your rails are dropping, then that's your choice. You don't really need a multimeter in all cases; software apps are pretty much right on with what my Fluke meter says on mine. ;) HWiNFO64 is a nice free app for reading many sensors, including all your rails. Just run something like Kombustor on the CPU and GPU at the same time and take a look. It's more than likely just fine anyway.
 
Oh... Didn't know you could check with just software. I'll download it when I go to the computer later and test it out. Thanks!
 
Don't know where to look in the app (HWiNFO64)?

You click on the app in the system tray and click on Sensors. It will bring up something like this, which includes just about everything you could possibly need to know, sensor-wise. All for free.

[Screenshot: HWiNFO64 sensors window]
The power supply rails would be the most noteworthy there. Scroll through it till you find the sensors you need info on, like power rails, temps, or whatever you happen to need. It's been my go-to program for a while when I want to check certain things.
 
Cool, thx a lot!!!!! Here is what I am getting:

Rail // Current // Min // Max // Average
3.3V // 3.360 // 3.360 // 3.376 // 3.361
5V // 4.812 // 4.785 // 4.812 // 4.811
12V // 11.795 // 11.732 // 11.859 // 11.808

Should I be worried that I am not getting a full 5V & 12V?
 
Interesting... I am very close to the edge...

5V minus 5% = 4.75V
12V minus 5% = 11.4V

Rail // Current // Min // Max // Average
3.3V // 3.360 // 3.360 // 3.376 // 3.361
5V // 4.812 // 4.785 // 4.812 // 4.811
12V // 11.795 // 11.732 // 11.859 // 11.808
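For anyone who wants to script the same check, here's a minimal sketch. The readings are hard-coded from the HWiNFO64 minimums above, and the ±5% figure is the standard ATX tolerance for these rails:

```python
# Check each rail's measured minimum against the ATX -5% floor.
# Readings are the HWiNFO64 "Min" column from the table above.
TOLERANCE = 0.05

rails = {          # nominal volts -> measured minimum
    3.3: 3.360,
    5.0: 4.785,
    12.0: 11.732,
}

for nominal, measured in rails.items():
    floor = nominal * (1 - TOLERANCE)
    headroom = measured - floor
    status = "in spec" if measured >= floor else "OUT OF SPEC"
    print(f"{nominal:>4}V rail: min {measured:.3f}V, "
          f"floor {floor:.3f}V -> {status} ({headroom:+.3f}V headroom)")
```

The 5V rail is the tight one: only ~0.035V of headroom at its recorded minimum, which is the "close to the edge" part.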

I can flip my vid cards around between the Q6600 and the Q9550
~ the boxes are in the wrong rooms... but I can deal w/ it...

Q6600 = Corsair HX620W running the R9-280-3GB
Q9550 = Rosewill RBR1000M running the HD6870-1GB

Sounds like if I swap my cards around I will be OK.
 
Hey Tzzird, did you get that extra 4GB of RAM yet?

I have not swapped my cards between my PCs yet.

The Q6600 is at my desk PC & Q9550 is at my Playseat racing PC.
The seat padding is crushed & I'm bored with racing lately, so FPS @ the desk wins!

Q6600 = runs a 37" Vizio @ 1080p
Q9550 = runs a 47" Philips @ 1080p
 
I am in a similar boat. I want to upgrade the GPU on my Q9550 system mainly for BF4, and I can't really justify spending the money upgrading the CPU/mobo/RAM and the GPU, since I also need an HD or SSD for it; the 500GB HD in it is failing.

I am also running an LG 25" 21:9 screen, 2560x1080. After using this screen, I don't want to go back to a square 16:9 screen (I know it's not square, but it looks really square after using the ultrawide).

If need be, I'd upgrade my main rig and move the 7970 down to the Q9550, but I would rather just get a card for this one.
current specs:
Q9550 @ stock clock
8GB Ram
XFX 6950 2GB
 
That 6950-2GB isn't cutting it for BF4? That game must be nuts.

What about EVERYTHING ELSE?
That should play everything much better than my 6870-1GB.
Do you have Tomb Raider? Can you try the benchmark & post results?
$10 today @ http://store.steampowered.com/app/203160/

I wouldn't spend money on a video card. I would overclock the Q9550 first if the mobo can do it.
~ I run CoreTemp in my systray at all times and watch for screwy temps.

I had to overvolt my CPU a little and adjust my RAM speeds in the BIOS.
~ I also have a good heatsink / fan = Xigmatek Dark Knight 120mm

I can tell immediately (just browsing the web) if my system BIOS has reset to default CPU speeds.
Windows Update chokes sometimes, or I have a (rare) heat spike / random reboot that trips my BIOS back to defaults.

You will be surprised how much that CPU bump from 2.8 to 3.2 helps & "feels", even though it's only ~14%.
 
I'd need to get a cooler to overclock; the fan on the stock cooler is on its way out. I'll see what deals I can find on one. My case is the HAF-XB cube, so it should be able to fit any air cooler on the market.

The higher-res monitor is putting a load on the GPU in BF4; it's 2560x1080. The 6950 worked OK on my i7 machine at 1080p.

I just bought Tomb Raider; I am guessing there is a benchmark section in the menus?
 
I get these frames at 2560x1080.

Normal
min 56
max 85
avg 73

Ultimate
min 17
max 32
avg 23

1920x1080
Normal
min 66
max 92
avg 86

Ultimate
min 22
max 35
avg 28
 
Thanks a lot, that is pretty interesting...

I've isolated your average fps results for comparison to mine @ 1920x1080:

Q9550 @ stock (2.8ghz) + 8GB Ram + XFX 6950 2GB
Normal = avg 86 fps
Ultimate = avg 28 fps


Q9550 @ OC'd (3.2ghz) + 8gb RAM + HD-6870-1gb
@ Normal = 89 fps
@ Ultimate = 25 fps

Looks like my overclocked CPU makes up the difference in our vid cards?

Someone mentioned previously that Tomb Raider is not a CPU-intensive game, though...

My older CPU w/ new card just for reference:
Q6600 @ 3.2ghz OC'd + 8gb RAM + R9-280-3gb
@ Normal = 160 fps
@ Ultimate = 56 fps
 
I'll install the game upstairs and see what my main rig does compared to your Q6600, since the 7970 is, I believe, the equivalent of the 280.
 
On my i7 3770K @ stock speeds with the 7970 @ 1080p:

Normal = 170 fps
Ultimate = 56 fps
 
You should be getting better performance. The 7970 is equivalent to a 280X, not a 280, and even assuming you are at stock clocks, the performance should be better. Is it a non-GHz Edition? What settings are you using to bench? Are you changing any setting in particular?
 
I just loaded up the game and turned off vsync and ran the benchmarks.
I have just a normal XFX 7970. I do have 3 monitors, but run the game on the middle one, the side screens just have the wallpapers showing.
Computer has been on for 5 days since last reboot, not sure if that makes a difference.
 
That could explain the difference between your results and mine. My 280X's numbers at stock clocks (1100/1500) are:

Normal = 195.5
High = 136.7
Ultra = 101.0
Ultimate = 67.6

I wouldn't have imagined the difference in clocks could make such a difference in performance, since the rest of our systems are kinda similar. 10+ average fps (Ultimate) can make a great difference in the gaming experience with any game.
 
My CPU is running stock clocks as well; nothing is overclocked on this machine.
 
Well, running the benchmark in that game, my chip actually runs underclocked at 1600-3300MHz, so I highly doubt stock vs. overclock makes any difference in that game. It's probably just the difference in GPU clocks, as I know the 7970 is very sensitive to GPU clock.
 
Hey Tzzird, did you get that extra 4GB of RAM yet?

I have not swapped my cards between my PCs yet.

The Q6600 is at my desk PC & Q9550 is at my Playseat racing PC.
The seat padding is crushed & I'm bored with racing lately, so FPS @ the desk wins!

Q6600 = runs a 37" Vizio @ 1080p
Q9550 = runs a 47" Philips @ 1080p

I did! As you said, it doesn't really make a difference in gaming. Benchmarks pretty much come out the same as with 4GB of RAM.

But with day-to-day desktop work, it feels soooooo much more fluid. I've been sitting on 4x1GB sticks for 8 years now; I should have upgraded to 4x2GB a long time ago.

Oh, and I use a 1440p screen as my desktop monitor, but I just upgraded my secondary monitor, a 37" Westinghouse from 8 years ago, to a 50" Westinghouse for $349 during Black Friday! Should've done this a long time ago. 3 HDMI inputs now instead of 1, not to mention a clearer, crisper display. Can't believe I paid $800 for that 37" back then.
 
I got my drives and CPU cooler for the Q9550 today.
Overclocked the CPU to 3.4GHz; seems to run stable, low 60s with Prime95.

I'll run the benches again after the games copy to the new drive.

EDIT: I just noticed that my 6950 is running @ x4 instead of x16. I am using the lower slot on the mobo because the GPU covers up most of the SATA ports. :(
Both PCIe slots say x16 on them, though; I figured it would be x16 for one card in either slot and x16/x4 if you use both.
This is the board I have:
http://www.asus.com/Motherboards/P5KE/

I guess this says that both slots are x16 size, but the black one only runs at x4 or x1:

Expansion Slots
2 x PCIe x16 (blue @ x16 mode, black @ x4 or x1 mode)
2 x PCIe x1
3 x PCI

I have a SATA card upstairs; I'll just put that in and move the GPU to the x16-speed slot.
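For a rough sense of why x4 hurts, per-direction PCIe bandwidth scales with lane count. A back-of-the-envelope sketch, assuming the P5K-E's slots are PCIe 1.x (~250 MB/s per lane):

```python
# Approximate per-direction PCIe 1.x bandwidth: ~250 MB/s per lane.
PER_LANE_MBS = 250

for lanes in (1, 4, 16):
    print(f"x{lanes:<2}: ~{lanes * PER_LANE_MBS} MB/s per direction")
# x1 : ~250 MB/s, x4 : ~1000 MB/s, x16: ~4000 MB/s
```

So the black slot at x4 gives the card roughly a quarter of the host bandwidth of the blue x16 slot.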
 
Q9550 @ 3.4GHz and GPU moved to the x16 slot.
1920x1080
Normal = 124.5 fps
Ultimate = 37.7 fps
 
I am trying to keep track of who's who & who's running what, to see the gains...

You were previously running like this: :)

Q9550 @ stock (2.8ghz) + 8GB Ram + XFX 6950 2GB @ 1080
Normal = avg 86 fps
Ultimate = avg 28 fps

Now you're like this: ;)

Q9550 @ 3.4Ghz, 6950 GPU moved to x16 slot @1080
Normal = 124.5 fps
Ultimate = 37.7 fps

That is a pretty worthwhile bump for (probably) a $35~50 heatsink, eh?
I'm sure you gained a little from the PCIe slot too.
Your games aren't running from an SSD, right?

Anyway, your gains look like this: :D
Normal = avg 86 fps to 124 fps = EDIT: 44% gain
Ultimate = avg 28 fps to 37 fps = EDIT: 32% gain

PS: What are your Q9550 temps at 3.4GHz with just browsing / YouTube-type use?
 
Soooooo.... that made me want to look at my % gains...

Before:
Q6600 + 8gb RAM + HD-4870-512mb @ 1080
@ Normal = 46 fps
@ High = 31 fps

After:
Q6600 + 8gb RAM + R9-280-3gb @ 1080
@ Normal = 160 fps
@ High = 117 fps
@ Ultra = 84 fps
@ Ultimate = 56 fps (playing at this)

My gains look like this:
@ Normal = 46 fps to 160 fps = EDIT: 248% gain
@ High = 31 fps to 117 fps = EDIT: 277% gain

PS: http://www.calculatorcat.com/free_calculators/percent_gain.phtml
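For anyone who'd rather skip the calculator site, percent gain is just (new − old) / old. A minimal sketch using the numbers above:

```python
def percent_gain(old: float, new: float) -> float:
    """Percent gain from old to new: (new - old) / old * 100."""
    return (new - old) / old * 100

# HD-4870 -> R9-280 numbers from above:
print(f"Normal: {percent_gain(46, 160):.0f}% gain")   # 248% gain
print(f"High:   {percent_gain(31, 117):.0f}% gain")   # 277% gain
```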
 
My Steam games are on a WD 2TB Green drive.
Next time I reboot, I'll try to remember to go into the BIOS, run at stock clocks, and see whether the OC or the PCIe slot move made the bigger difference.

Temps are in the low-to-mid 40s idling and playing YouTube vids; one of the cores is warmer at 49-51, but the other 3 are low-to-mid 40s.

I put BF4 on my 512GB SSD and it plays better now; it would stutter every 10 seconds or so with the bad hard drive.
 
Soooooo.... that made me want to look at my % gains...

Before:
Q6600 + 8gb RAM + HD-4870-512mb @ 1080
@ Normal = 46 fps
@ High = 31 fps

After:
Q6600 + 8gb RAM + R9-280-3gb @ 1080
@ Normal = 160 fps
@ High = 117 fps
@ Ultra = 84 fps
@ Ultimate = 56 fps (playing at this)

My gains look like this:
@ Normal = 46 fps to 160 fps = 110%
@ High = 31 fps to 117 fps = 116%

PS: http://www.calculatorsoup.com/calculators/algebra/percent-difference-calculator.php

Your gains are way more than that; a 100% gain would be 46 → 92.

Also, that calculator app looks broken. Percentage difference of 50 and 100 comes out to 66.66%???
 
You are right!!! I used % DIFFERENCE vs. % GAIN (habit = I use DIFF @ work)

Here is a percent GAIN calculator:
http://www.calculatorcat.com/free_calculators/percent_gain.phtml

I'll revise the #'s, thanks for catching that...

PS: There is an explanation of % diff on this page & it perplexes me too...
http://www.calculatorsoup.com/calculators/algebra/percent-difference-calculator.php

They have one called Percentage Change Calculator, where 50 → 100 is a 100% change:
http://www.calculatorsoup.com/calculators/algebra/percent-change-calculator.php
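Here are the two formulas side by side, which shows why 50 → 100 reads as 66.67% on the "difference" calculator but 100% on the "change" one (a minimal sketch of the same math those pages use):

```python
def percent_difference(a: float, b: float) -> float:
    # Symmetric: |a - b| relative to the AVERAGE of the two values.
    return abs(a - b) / ((a + b) / 2) * 100

def percent_change(old: float, new: float) -> float:
    # Directional: change relative to the STARTING value.
    return (new - old) / old * 100

print(percent_difference(50, 100))   # 66.66...  (% difference)
print(percent_change(50, 100))       # 100.0     (% change / gain)
```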
 
Those check out as multipliers, too: it's 3.5x the frame rate on Normal and 3.8x on High, which matches the 248% and 277% increases.
 