Crysis (GTX295 Quad SLI) on stock 920

psyside (2[H]4U) | Joined: Apr 23, 2009 | Messages: 2,243
I'm planning to get 2x GT300 as soon as they're in stock. However, I'm having a hard time getting my overclock stable, and I was wondering if some of you guys would be kind enough to run some benchmarks for me with an overclocked and a stock i7 920 at 1920x1200, Enthusiast settings. That way I'd know how much of a bottleneck to expect with 2x GT300; they should be similar in performance to GTX 295 SLI. And as far as I know, at 1920x1200 the real bottleneck is the cards, since Crysis is a GPU-bound game. Thanks in advance, guys!

I know I can't really compare the GTX 295 to the GT300, but it'll still give me an idea ;)

Thanks one more time!
 
With a 920 at stock, you would probably bottleneck two GT300 cards at 1920x1200. Also, why would you waste $1,200 on two cards that wouldn't even be under 25% load with a 1920x1200 display, unless you just like spending money? The minimum you want with two GT300s is 2560x1600; anything less than that resolution is a waste of money, since two GTX 285s can already run Crysis completely maxed out at 1920x1200.

Two GTX 285s would do just as well as two GT300 cards at 1920x1200. Quad SLI is worthless; SLI doesn't scale for shit past three cards. And no, two GTX 295s wouldn't come anywhere close to showing the performance of two GT300s, since three GTX 285s are faster than two GTX 295s.

Woot for misinformed people..

But you need to overclock that CPU anyway. The minimum you want is 3.2GHz, preferably 3.4GHz or higher if you plan to use SLI with those cards. But there's no reason to waste your money; just go with a single GT300 card and leave the CPU at stock if you have a bad overclocking chip. This is all relative anyway, since NVIDIA hasn't released any performance numbers, and for all we know there's only a 10-20% gain with the GT300 over the GTX 200s (unlikely, but possible).
 
Last edited:
I don't think a stock i7 would be the bottleneck with two GTX 295s, not at 1920x1200 in Crysis, especially if you factor in AA. Either way, I agree that much GPU power is rather overkill for that resolution, but it would be great for 2560x1600.

We don't know the hard specs of the GT300, so trying to compare the two is futile.
 
It's not overkill, guys, because I use mods and extreme settings: massive supersampling AA, AF, high textures, etc., and I don't want to drop to a low 25-30 fps in big fights. Trust me, I know what I'm doing; I just wanted to see some benchmarks. You can never have enough power :cool:

Thanks for the answers anyway ;)
 
Guess you didn't understand me. I'm not saying that GTX 295 quad SLI is good or that it scales nicely, just that it takes a good CPU and hardware to drive it.

I'm talking about the amount of power required to run quad SLI, because it's the most powerful video card setup at the moment, so it's kinda "close" to GT300 SLI. Hope that clears things up a bit.
 
Woot for misinformed people..

http://www.guru3d.com/article/geforce-gtx-285-review--3way-sli/17

I see 285 SLI BARELY getting ~60 fps average at 1920x1200 on "high" (NOT "very high") settings with only 2x AA. That's not exactly "maxed out".


Exactly. If it gets a 60 fps average in Crysis with only those settings, then they obviously aren't maxing it out, on purpose, because it's an apples-to-apples test. The way the game is designed, it's supposed to run between 30-35 fps; that's how the engine is scaled. What people are missing is that there's zero benefit to running anything over 35 fps. With the Crytek engine it's all about minimum frame rates and trying to keep a constant frame rate. Even if you're running 60 fps, if it still dips into the teens, then the 60 fps doesn't mean anything. Though I still blame most of the SLI scaling issues on the devs of that game; they've fixed them somewhat, but there are still problems that will never be fixed.
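The min-vs-average point above can be sketched with a quick frame-time calculation. The numbers below are made up for illustration, not taken from any benchmark in this thread:

```python
# Hypothetical frame-time capture (milliseconds): mostly ~60 fps frames,
# with a handful of dips to ~18 fps. Numbers are invented for illustration.
frame_times_ms = [16.7] * 90 + [55.0] * 10

total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s   # frames divided by elapsed time
min_fps = 1000.0 / max(frame_times_ms)    # worst single frame

print(f"avg: {avg_fps:.1f} fps, min: {min_fps:.1f} fps")
```

Here the average lands near 49 fps even though the worst frames run at about 18 fps, which is exactly why a bare average can hide the dips that actually hurt playability.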


The one fact you're missing is that while the GT300 GPU will be better than any of the GTX cards, AA is all about memory. So even if they put the fastest memory possible on the GT300s, if it's only a 1GB card, there still isn't enough space for insane amounts of AA. That's the only reason the GTX 295 is better than the other GTX cards: with its 2 gigs of RAM per card, it can make up for how slow it is by having the extra storage for AA. For the GT300 to make up that difference, it would have to be a 2GB card, or even a 1.5GB card. Unless that happens, it won't make a difference whether you have the GTX 285 or the GT300 at 1920x1200; you would just end up getting better frame rates at the same exact settings you'd run on the GTX 285.
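As a rough back-of-envelope sketch of why AA eats VRAM (my own numbers and simplifications, not from this thread; a real driver also stores resolve buffers, textures, and more):

```python
# Estimate memory for the multisampled color + depth buffers at 1920x1200.
# Assumes RGBA8 color (4 bytes) and D24S8 depth/stencil (4 bytes) per sample.
width, height = 1920, 1200
bytes_color = 4
bytes_depth = 4

def msaa_buffer_mb(samples):
    pixels = width * height
    # with MSAA, each sample stores its own color and depth value
    return pixels * samples * (bytes_color + bytes_depth) / 1024**2

for s in (1, 4, 8):
    print(f"{s}x: {msaa_buffer_mb(s):.0f} MB")
```

Under these assumptions the buffers alone grow from roughly 18 MB with no AA to about 140 MB at 8x, before counting textures or geometry, which is the general shape of the "AA is all about memory" argument.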
 
If you are going to spend that much money on video cards why would you not buy a better monitor or cpu?
 
Exactly. If it gets a 60 fps average in Crysis with only those settings, then they obviously aren't maxing it out, on purpose, because it's an apples-to-apples test. The way the game is designed, it's supposed to run between 30-35 fps; that's how the engine is scaled. What people are missing is that there's zero benefit to running anything over 35 fps. With the Crytek engine it's all about minimum frame rates and trying to keep a constant frame rate. Even if you're running 60 fps, if it still dips into the teens, then the 60 fps doesn't mean anything. Though I still blame most of the SLI scaling issues on the devs of that game; they've fixed them somewhat, but there are still problems that will never be fixed.


The one fact you're missing is that while the GT300 GPU will be better than any of the GTX cards, AA is all about memory. So even if they put the fastest memory possible on the GT300s, if it's only a 1GB card, there still isn't enough space for insane amounts of AA. That's the only reason the GTX 295 is better than the other GTX cards: with its 2 gigs of RAM per card, it can make up for how slow it is by having the extra storage for AA. For the GT300 to make up that difference, it would have to be a 2GB card, or even a 1.5GB card. Unless that happens, it won't make a difference whether you have the GTX 285 or the GT300 at 1920x1200; you would just end up getting better frame rates at the same exact settings you'd run on the GTX 285.

So you're saying the minimum fps will be the same with GT300 SLI as with GTX 295 SLI or GTX 285 SLI? Sorry, I don't think so. After all, it's not just the memory; shaders also help. But yes, I'll agree that if the GT300 doesn't come with 2GB of VRAM, it will be a big mistake from NVIDIA; we've been stuck at 1GB long enough already.
 
That's the only reason the GTX 295 is better than the other GTX cards: with its 2 gigs of RAM per card, it can make up for how slow it is by having the extra storage for AA.

Wrong. The GTX 295 has 896MB of RAM per GPU. The contents of RAM for each GPU must be identical, and cannot be shared between GPUs, so the GTX 295 is effectively an 896MB card. The same is true for 4870X2 cards (Edit: except that the 4870X2 has 1GB per GPU).

There are a few single-GPU cards with more than 1GB of RAM: Sapphire's Vapor-X 4870 2GB, eVGA's 1792MB GTX 275, and four different GTX 285 2GB cards.
 
Last edited:
Wrong. The GTX 295 has 896MB of RAM per GPU. The contents of RAM for each GPU must be identical, and cannot be shared between GPUs, so the GTX 295 is effectively an 896MB card. The same is true for 4870X2 cards.

There are a few single-GPU cards with more than 1GB of RAM: Sapphire's Vapor-X 4870 2GB, eVGA's 1792MB GTX 275, and four different GTX 285 2GB cards.

I'm very against the GTX 295 for that exact reason. I think dual-GPU cards are a waste.
 
I don't know if this is the right place, but it seemed kinda stupid to open a new thread for this :)

I need someone with one or two 5870s to run some benchmarks for me with the same settings/demos as mine; the idea is to compare my old GTX 295 with the 5870 in general. I want the exact same benchmark run, even if real-gameplay tests are much better, and yes, I know that! [H] reviews rock, but sadly that's not the same testing method.

The reason this is so important to me is that I've ordered 2x 5870; they should be here in a week or so. The point is to compare my old GTX 295 with my future upgrade. Of course I know it will be faster, but by how much? So, long story short, here are my testing method/settings. I could wait and test for myself, but I can't wait :eek:

Crysis Warhead:

Test #1

1920x1200, DX10, Enthusiast, no AA/AF
Codepath: Ambush, 3 loops = 5870 Crossfire

Test #2

1920x1200, DX10, Enthusiast, 4x AA, 16x AF
Codepath: Ambush, 3 loops = 5870 Crossfire

Test #3

1920x1200, DX10, Enthusiast, no AA/AF
Codepath: Ambush, 3 loops = single 5870

Test #4

1920x1200, DX10, Enthusiast, 4x AA, 16x AF
Codepath: Ambush, 3 loops = single 5870


Guys, you'll help me a lot if you run these tests. Thanks in advance! Oh, and I forgot to mention: my i7 is now stable at 3.8GHz, so no bottleneck, I guess.
 

http://forum.beyond3d.com/showthread.php?t=49120&page=154

that should help ya, scroll down, someone did it for a forum member there.
 
Personally, I would get either the new HD 58xx cards or NVIDIA's upcoming cards.

I would wait for the NVIDIA cards to arrive, see which one is better, and spend on that.

Unless, of course, you're in a hurry for some reason, which leaves only the HD 58xx for you.
 
X-Fire hates Crysis, so here:

Processor: Intel(R) Core(TM) i7 CPU 965 @ 3.20GHz @ 3184 Mhz
CPU ID: Intel64 Family 6 Model 26 Stepping 4
Operating System: Microsoft Windows 7 Ultimate
Physical memory: 5.99 GB
Display adapter: ATI Radeon HD 5800 Series 1024 MB
Driver version: 8.660.0.0 (20090910000000.000000-000)

==============================================================
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 85.82s)
!TimeDemo Run 0 Finished.
Play Time: 52.38s, Average FPS: 38.18
Min FPS: 30.37 at frame 145, Max FPS: 56.34 at frame 1427
Average Tri/Sec: -30699514, Tri/Frame: -804032
Recorded/Played Tris ratio: 1.22
!TimeDemo Run 1 Finished.
Play Time: 47.75s, Average FPS: 41.89
Min FPS: 30.37 at frame 145, Max FPS: 56.34 at frame 1427
Average Tri/Sec: -32117038, Tri/Frame: -766745
Recorded/Played Tris ratio: 1.28
!TimeDemo Run 2 Finished.
Play Time: 47.30s, Average FPS: 42.28
Min FPS: 30.37 at frame 145, Max FPS: 58.00 at frame 1456
Average Tri/Sec: -33164016, Tri/Frame: -784357
Recorded/Played Tris ratio: 1.25
TimeDemo Play Ended, (3 Runs Performed)
==============================================================

//////////// Summary \\\\\\\\\\\\\

09/27/2009 01:52:36 - Microsoft Windows 7 Ultimate

DirectX 10 ENTHUSIAST 3X @ Map: airfield flythrough @ 8 1920 x 1080 AA 4xx
==> Framerate [ Min: 30.37 Max: 57.17 Avg: 42.09 ]

And my CPU is actually at 3.8GHz; it just reads wrong in Win 7 RC.
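For anyone averaging several timedemo runs by hand, here's a minimal sketch that pulls the per-run Average FPS lines out of a log in the format pasted above (the benchmark tool's own summary line may weight or select runs differently, e.g. discarding the warm-up run):

```python
import re

# Sample log fragment in the format produced by the Crysis timedemo,
# using the run lines quoted earlier in this thread.
log = """\
!TimeDemo Run 0 Finished.
Play Time: 52.38s, Average FPS: 38.18
!TimeDemo Run 1 Finished.
Play Time: 47.75s, Average FPS: 41.89
!TimeDemo Run 2 Finished.
Play Time: 47.30s, Average FPS: 42.28
"""

# Grab every "Average FPS: <number>" value and average them.
runs = [float(m) for m in re.findall(r"Average FPS: ([\d.]+)", log)]
overall = sum(runs) / len(runs)
print(f"{len(runs)} runs, overall average: {overall:.2f} fps")
```

Note that the first run is usually slower (caches and shaders still warming up), which is why the summary figure in the actual log can sit above the naive three-run mean computed here.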
 
Hey m8, thanks a lot for the benchmarks! Highly appreciated! Can you run another test with a different codepath, Ambush? If you're bored with it, no problem; this is already enough. Thanks!
 