ASRock 775Dual-VSTA Incompatible with Nvidia 8800 Series

The x4 PCI-E slot is limiting your throughput by at least 15%. If it were a full 16 lanes, or even 8, you would score in the 10K range, which would be on par with x16 systems given your CPU/RAM/video subsystem specs.
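For rough context, here's the back-of-the-envelope math, assuming the slot runs at PCIe 1.x signaling (about 250 MB/s per lane per direction):

x4 = 4 x 250 MB/s = ~1 GB/s
x8 = 8 x 250 MB/s = ~2 GB/s
x16 = 16 x 250 MB/s = ~4 GB/s

So the card is getting roughly a quarter of the bus bandwidth it was designed for; how much that actually costs varies by game.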
 
What's everyone's 3DMark06 score with THIS motherboard?
My config is:
E6600 @ 2.66GHz
2GB DDR2-667 (PC2-5300)
eVGA 8800GT 512MB

My score is pretty crappy at 9100.

My score in 3DMark06 is 9373 with:

=Windows Vista Ultimate 32-bit
=ASRock 775Dual-VSTA, BIOS: v3.00
=Intel Core 2 Duo E6600 <-- mobo limits the overclock
=eVGA 8800GT KO @ stock speeds
=2x1GB DDR2-800MHz running at DDR2-533MHz

When I used to have my CPU & GPU on the Gigabyte P35C-DS3R, I broke 10,000 points.

-Optimummind
 
I guess compared to other users with the same board this is normal. BTW, I just RMA'ed my eVGA, and my temps have dropped dramatically with the stock cooler:
No load = 40C
Full load = 55C
It used to reach 80C with my old eVGA because I had the old model with the smaller fan.
 
3DMark06 scores:

8800GTS SSC (G80) at stock speeds

Default: 8750

4xAA/16xAF: 8370
 
You have to enable AA in the driver control panel on both the ATI and Nvidia cards to get AA with HDR in Oblivion. Performance with the x1950 was okay but not great. With the 8800, I run Oblivion at 1680x1050 with 4xMSAA/16xAF, HDR, and everything turned on and maxed out, and it's very smooth.

Gotta love these cards.

Remember how Oblivion was the bane of the 7900/x1950 series cards' existence? How things have changed with the 8800 series! I bet you keep supersampling transparency AA on as well? I see you don't get much of a hit in 3DMark06 with AA and AF turned on from the control panel... that is what swung me into getting the 8800GTS 640. It'll be a nice upgrade from my 7950... I probably won't double my 3DMark06 score, but a 3000-point gain would be significant!

Right now I get around 5500 at stock, and just under 5000 with 4xAA/8xAF.
 
I haven't tried supersampling AA, but I remember how much of a performance drain it was on the x1950pro.
 
BTW, I finally installed Crysis on Vista64. Image quality is excellent when set to Very High (DX10), and the first level is actually pretty playable. I was surprised the game played as well as it did. The XP-tweaked version is definitely missing some of the pop and fidelity of the Very High DX10 version, but the speed gains make the trade-off well worth it.
 
Hello, Leadtek's support sent me a modified Gen1 BIOS today, and I can confirm it works with the Leadtek PX8800 GT 512MB on the ASRock 4CoreDual-VSTA @ PCI-E x4. It's version 62.92.1F.00.00 (12/27/2007). My card originally shipped with the Gen2 BIOS 62.82.16.00.36 (12/04/2007), and NiBiTor 3.7 was unable to switch it to Gen1, so I wrote to Leadtek support. On Friday I sent them the warranty number of the card, and Monday morning I had the BIOS in my email. Fantastic support!

Previously I used the Asus Gen1 BIOS mentioned in this forum, which also worked very well. Because I don't know how to attach the file to my post, here's the link to the mvktech site where I posted it: Leadtek 8800 GT 512MB Gen1 bios
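For anyone else attempting this, the usual DOS flashing procedure looks roughly like this (just a sketch - I'm assuming the common nvflash 5.x switches, and "gen1.rom" is a stand-in for whatever your Gen1 image is actually called, so check your nvflash version's help output before trusting any of these):

nvflash -b backup.rom      <- save the card's current BIOS before anything else
nvflash -4 -5 -6 gen1.rom  <- flash the Gen1 image, overriding the ID checks

Keep backup.rom somewhere safe so you can flash back if something goes wrong.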
 
Nice link! ... Well, it depends. In my own experience (compared to similar hardware but with PCI-E x16), it's a 4-14% drop (depending on the game you're playing), but you don't care whether you've got 150fps or 120fps. The main thing is that I'm able to play all my favorite games at 1680x1050 with max AA and full details. I also played the Crysis demo (won't buy the full game, I don't like it as much as UT3) and let the game detect the best settings; it chose 1680x1050 with medium details and ran absolutely OK. Maybe I should try higher settings then ;) Is there some benchmark for the Crysis demo?
 
The demo does come with two benchmark executables that are widely used on benchmarking sites.
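If I remember right, they're the Benchmark_CPU.bat and Benchmark_GPU.bat files in the demo's Bin32 folder; each one loops a timedemo flyover of the island and reports average FPS at the end. (Going from memory here, so the exact names/paths may differ slightly.)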
 
I'm a bit new to this whole system building thing, so please bear with me...

I went and grabbed an 8800GT off eBay for my current system, which contains an ASRock 4CoreDual-SATA2 mobo. Not realising the compatibility issues (silly me!), I received the card yesterday and then attempted to install it in my computer, of course to no avail. Searching for an answer, I found this thread. I read all the way through it, coming to the conclusion that I either need a new mobo, or I need to flash the BIOS of my card to Gen1 - correct-ish??

Now, only having two computers, one of which is an old desktop without even a PCI-E slot, I figure buying a new mobo would be the 'easiest' option. I can't think of anyone I know who has a decent desktop computer either, so simply popping it into a mate's computer isn't really an option.

I don't mind buying a new mobo if it's going to benefit me in the future, and was looking at this one: http://www.ebuyer.com/product/131173/show_product_specifications
I'm sure it's not the best mobo around, but it should support my Gen2 8800GT, right?

If anyone could get back to me on this, I'd be very grateful.
 
I don't mind buying a new mobo if it's going to benefit me in the future, and was looking at this one: http://www.ebuyer.com/product/131173/show_product_specifications
I'm sure it's not the best mobo around, but it should support my Gen2 8800GT, right?

If anyone could get back to me on this, I'd be very grateful.

Yes, that mobo will support the 8800GT. It's the same mobo I used to flash my eVGA 8800GT Gen2 to a Gen1 card. Overclocking options are good too, enabling me to overclock my Core 2 Duo E6600 from 2.4GHz to 3.2GHz.

-Optimummind
 
I don't know if you've read this already, but...

http://www.tgdaily.com/content/view/35641/118/

Seems like MS has moved up the release of Windows 7 to next year!

Can we now say that Vista has been the biggest OS failure since Windows ME? I was going to "upgrade" to Vista, but what's the sense if the next one is coming out in 2009?
 
Cheers m8, I've ordered it now! This will be the first mobo transplant I've ever performed, so wish me luck ;-) .
 
Update:

ASRock now offers a BIOS update for ALL their dual boards based on Intel CPUs, enabling use of the 3870/3850 PCIE 2.0 cards in XP.

http://www.asrock.com/mb/download.asp?Model=4CoreDual-SATA2&s=

Wish they would have done this for GeForce Gen2 cards too!

I might end up getting a 3870 after all, especially considering the stuff I'm reading on Guru3D about their image quality being better (especially with AA enabled). Still undecided, lol.
 
From the benchmarks I've seen, the 8800 series is much faster with AA than the 3870/3850 cards. Newer Catalyst drivers seem to be closing the gap, though.
 
Yeah, I'm a bit down on Nvidia right now because my 7950GT stopped sending a signal to the DVI port - I'm getting a 'no signal' message on both ports. I happened to have a GeCube x1950pro 512MB card handy, and that's what I'm using right now. The colors seem much more vivid and saturated with this card than with the 7950GT, or the 6600GT I had before that (those cards also blew out highlights in my pictures, but when I turned the brightness down, I couldn't see shadow detail... I think they had a lower dynamic range than the x1950pro). You can really tell the difference.

It benched 5500 in 3DMark06, and 5000 with 4xAA/8xAF with both temporal and adaptive AA enabled, AF set to HQ, and full trilinear turned on. Hopefully the 8800GTS has vivid color like the x1950pro and not like the GeForce 7 series. Since you've had the x1950pro and now the 8800GTS, you'd probably know whether the desktop colors, saturation, and dynamic range of the 8800GTS match the x1950pro.
 
I consider myself a pretty observant person, and I too haven't noticed any image quality differences between my current 8800GT and the previous 1900XT 512MB I used for 10 months.

I've owned these cards in the following order and the only difference I've noticed is performance gains. =)

*Radeon 7000 64MB --> GeForce Ti-4200 64MB --> Radeon 9800 Pro 128MB --> GeForce 6800GT 256MB --> Radeon 1900XT 512MB --> 8800GT 512MB
 
Good to know... I feel better about the 8800 now; I always liked the Nvidia drivers better.

We've had some of the same upgrade paths. Mine went something like this:

GeForce 2 GTS 32MB (Ironic, eh? My first GPU was also a GTS.)
ATI Rage Fury MAXX 64MB (two GPUs on one card!)
Radeon 8500 LE 64MB
Geforce 4 Ti 4200 128MB (Had this from 2002-05)
Geforce 6600GT 128MB (Had this from 2005-07)
Geforce 7950GT 512MB
Radeon x1950Pro 512MB

Image quality across dual displays is pretty important to me. Since the GeForce 4 Ti 4200, I've always had dual displays, but until the 6600GT I never had dual DVI ports (and always wanted them)... they've become standard now!
 
3dfx Voodoo 1
3dfx Voodoo 2
3dfx Voodoo 3
GeForce 2 MX
GeForce 3 Ti200
Radeon 9500
Radeon 9800 Pro
GeForce 6600GT
GeForce 7800GS
Radeon x1950pro
8800GTS SSC
 
I've heard the Voodoos had some amazing image quality, especially for so early on. Is this true?
 
The Voodoo3 integrated 2D and 3D so it didn't require a passthrough cable, but it was still limited to 16-bit color output, which is where the TNT2 showed it up.

Oh, I forgot to put my first GPU on there, the Rage Pro... the good ole Mac days.

Okay, we are now making this thread far longer than it needs to be! :rolleyes:

Edit:
MC6847 = my first desktop computer's GPU
 
Is there ANY chance that ASRock is going to release a BIOS update that will make this mobo compatible with 8800GTs? I'm nearing the end of my Step-Up window with eVGA, and if I can't get a really worthwhile card that will work with my board, I'm going to have to eat the excessive full retail price I paid for my 8600GT. I don't have the cash to upgrade the mobo, RAM, and GPU all at once.
 
I just ordered the 8800GTS 640 from Newegg! It should be here by about Thursday. In the meantime, I had a question. I saw a thread on OC where people were talking about shutting down the PCIE bidirectional bus for the 8800GTS in the BIOS (I think it was you, stickboy) to cut down on artifacting in games. Does this remove the artifacting completely, and where is the setting in the BIOS? I can't find it.
 
It's the PCIE Downstream Pipeline option; it should be in the chipset options. With it enabled, certain games, especially STALKER, would produce tons of graphic errors - the kind you usually see when the core or memory is clocked too high. Crysis also produced some graphic corruption with the setting on. My 3DMark06 scores were actually higher with the option off. Make sure to get the newest Nvidia drivers, too.

Did you get the SSC version?
 
eVGA has posted a BIOS for their G92 8800GTS card that is supposed to make it compatible with PT880 Pro/Ultra 4CoreDual-type motherboards. And since they include the newest BIOS on their shipping cards, I would think those would be compatible with the 4CoreDual without flashing.

Edit: this is not a Gen1 BIOS but a new version of their shipping Gen2 BIOS that has Gen1 compatibility built in, and it supposedly works on a 4CoreDual-SATA2 according to one user.
 
eVGA has posted a BIOS for their G92 8800GTS card that is supposed to make it compatible with PT880 Pro/Ultra 4CoreDual-type motherboards. And since they include the newest BIOS on their shipping cards, I would think those would be compatible with the 4CoreDual without flashing.

Edit: this is not a Gen1 BIOS but a new version of their shipping Gen2 BIOS that has Gen1 compatibility built in, and it supposedly works on a 4CoreDual-SATA2 according to one user.

Cool! If they do the same for the 8800GT before my Step-Up expires, I'd be in good shape! Where did you see that user's experience - on the eVGA forums?
 
Let me know what happens... I have 21 days left in my Step-Up window. Unfortunately, eVGA currently only offers the vanilla 8800GTS 512 and not the KO or SSC version, so I don't have much reason to upgrade yet.
 
OK, I searched high and low and still can't find that PCIE setting for turning off the downstream pipeline. From what I understand, this is absolutely necessary to stop artifacting with the 8800GTS and 3870 cards. I went into the Advanced and Chipset Configuration menus, but it just isn't there. Can anyone point out where it is in the list? I have everything set to Auto, including the PCIE clock... maybe that's the problem?
 
BTW, is the 8800GTS 512MB a lot better than the 8800GTS 640MB? I mean, if the difference is 10% or less, I really don't see the need to "upgrade" to it.
 
The vanilla 8800GTS 512 (G92) would not be that much faster than the G80 8800GTS 640 SSC (the 112-SP model). The larger memory and wider bus on the G80 card help it catch up to the newer card at high resolutions and with AA enabled. But the KO or SSC version of the GTS 512 would probably be 10-20% faster in certain situations. In Crysis, the GTS 512 does very well and beats the GTX in most benchmarks.

Considering I paid $369 for my 8800GTS SSC, I would only have to pay the $6 shipping to step up to the vanilla 8800GTS 512, which costs the same. I'd be willing to pay a little extra to get the added performance of the KO or SSC model.

The G92 GTS 512 is a lot better than the standard G80 8800GTS 640, though.
 
Hey stickboy, can you show me how to change the PCIE downstream pipeline setting? Like, what submenu is it under, and what are the settings just above and below it? Is everything else (including the PCIE clock) set to Auto? I can't figure out why I can't find it!
 
I'll take a look at my BIOS the next time I'm on that computer.

That's a good card for the price - almost as fast as a GTX, and guaranteed to work with your mobo. I'd try overclocking it with RivaTuner to see what you can get out of it.
 