1.1 is fine, that's just the idle state so it won't use more power, but once you run a 3D application/game it will boost up to 3.0 if your board supports it (you mentioned 2.0, not 3.0). Mine is still running at x8 though, not x16.
I had no issue with the card and my MB, it detects it every time. The only issue is the x8 1.1, and when I run 3D applications/games it just shows x8 3.0, never changes to x16. My point is that running at x16 still gives a bit of a performance increase over x8, which I also read online. Check the pics I...
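For what it's worth, the x8 vs. x16 worry can be sanity-checked with some back-of-the-envelope math. The per-lane signaling rates and encoding overheads below are the published PCIe spec values; the helper function itself is just an illustrative sketch, not from any tool mentioned above:

```python
# Theoretical per-direction PCIe bandwidth by generation and lane count.
# Per-lane raw rates (GT/s) and encoding efficiency from the PCIe specs:
# Gen 1.x: 2.5 GT/s with 8b/10b, Gen 2.x: 5.0 GT/s with 8b/10b,
# Gen 3.x: 8.0 GT/s with 128b/130b.
RATES = {
    "1.1": (2.5, 8 / 10),
    "2.0": (5.0, 8 / 10),
    "3.0": (8.0, 128 / 130),
}

def bandwidth_gbps(gen: str, lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a given PCIe gen and lane count."""
    rate_gt, efficiency = RATES[gen]
    # Each lane carries rate * efficiency gigabits of payload per second; /8 for GB/s.
    return rate_gt * efficiency * lanes / 8

for gen, lanes in [("1.1", 8), ("2.0", 16), ("3.0", 8), ("3.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {bandwidth_gbps(gen, lanes):.2f} GB/s")
```

The takeaway: x8 at 3.0 (~7.9 GB/s) is almost identical to x16 at 2.0 (8.0 GB/s), which is why the real-world performance difference at x8 tends to be small on this generation of cards.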
Hey guys, quick question: I have a Gigabyte 7970 GHz Edition. How come GPU-Z only detects it at x8 lane width and not x16? Just curious, since I can't find any threads on it. Is there a way to change it to x16? My MB is an ASRock Z77 Extreme4... PS: it's in the first (top) PCIe slot.
Hello, I'm curious whether the problem is with the motherboard or the video card. First of all, my MB is an ASRock Z77 Extreme4. Now, I just got a new Gigabyte 7970 GHz Edition card, which is PCIe 3.0 and x16. Prior to this card I had a Radeon 5870, PCIe 2.0 and x16, and that card shows as (PCI-E 2.0x16@x16...
Hey, I'm curious whether the problem is with the motherboard or the video card. OK, I just got a new Gigabyte 7970 GHz Edition card, which is PCIe 3.0 and x16. Prior to this card I had a Radeon 5870, PCIe 2.0 and x16, and that card shows as (PCI-E 2.0x16@x16 2.0) in GPU-Z. Now, when I install my...
Thanks for the advice... to everyone. And yeah, I've been pushing my E6400 to 3.8 GHz; 3.9 GHz is the most I can run it at on air with a Zalman 9700, and that's only just stable, so I leave it at 3.6 GHz. Plus all of that is being powered by a 450W power supply... ain't that something.
I have my setup listed below. Well, I have an E6400 OC'd to 3.6 GHz at the moment, and I was thinking of getting a Q6600 my friend is selling to me for $160. Is it worth trading my OC'd E6400 for the Q6600? I'm confused about whether I should get it or stick with what I have. And second, if I get the Q6600...
Yes, I already reset the system, and nothing. I'm thinking it's the adapter I got, which I have the link for. I was thinking of buying an HDMI-to-DVI cable itself. Which cable or device are you using to connect your PS3 to the monitor (the same one I have, right)?
Hey, I have a 60GB PS3, and I'm using a 22" Samsung SyncMaster 226BW monitor, and I got an HDMI-to-DVI adapter from Newegg. OK, I connected the adapter to the PS3, the DVI cable is connected to the other end of the adapter, and the other end of the cable goes to my 22" LCD. There's a connecting...
OK, never mind guys, I just figured it out. The dude emailed me and told me it has a hardware OC, which he did himself: from the stock 500 MHz to a stable 620 MHz core clock, and from 1600 MHz to 1800 MHz on the memory.
By hardware how? Because it isn't overclocked by software. I got it from a dude off Craigslist; I traded something for it. I popped it into my machine and that's the clock speed I got. I just found it odd that it's clocked so high compared to stock and even OC'd 8800 GTS 640MB cards.
No, it was given to me. But not even the OC models come clocked that high. The OC models come overclocked at around 575 MHz max from what I've seen online, and mine is at 620 MHz and I'm not even OC'ing it... so IDK.
I have a BFG GeForce 8800 GTS 640MB, and I read the specs online, and they show the GPU should be around 500 MHz. But check this out: I used the program GPU-Z 0.2.3,
and with my 8800 GTS 640MB it shows this:
GPU clock: 620 MHz, Memory: 900 MHz (1800 MHz), and Shader: 1300 MHz...
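The "900 MHz (1800 MHz)" reading trips a lot of people up: GDDR memory transfers data on both clock edges, so GPU-Z shows the actual clock and the doubled effective rate side by side. A minimal sketch of that arithmetic (the helper name is my own, for illustration only):

```python
def effective_mhz(actual_mhz: float, transfers_per_clock: int = 2) -> float:
    """DDR-style memory moves data twice per clock cycle, so monitoring tools
    often list both the actual clock and the effective data rate."""
    return actual_mhz * transfers_per_clock

# The reading above: 900 MHz actual -> 1800 MHz effective.
print(effective_mhz(900))
# The seller's quoted "1600 MHz to 1800 MHz" memory OC is in effective terms,
# i.e. 800 MHz -> 900 MHz actual.
print(effective_mhz(800))
```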
Dude, go with Windows XP x64. I'm running that now and I have no problems with it, because Vista sucks; wait until SP1 comes out for it. Since you're using Vista Premium 32-bit: you see, you're getting 4GB, and any 32-bit OS will only detect about 3GB of that RAM. I would stick with Windows XP x64... even games...
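The "only 3GB" behavior comes from the 32-bit physical address space being shared with memory-mapped hardware, so part of the 4 GiB range is reserved and can't back RAM. A rough sketch of the arithmetic; the 1 GiB reservation below is an assumed ballpark (it varies by board and graphics card), not a measured figure:

```python
# A 32-bit OS can address 2**32 bytes = 4 GiB total. Chipset and GPU MMIO
# apertures are mapped into that same range, displacing an equal amount of RAM.
ADDRESS_SPACE_GIB = 2**32 / 2**30      # 4.0 GiB total 32-bit address space
mmio_reserved_gib = 1.0                 # assumed: chipset + 512MB-1GB GPU aperture
usable_ram_gib = ADDRESS_SPACE_GIB - mmio_reserved_gib
print(f"usable RAM on 32-bit: ~{usable_ram_gib:.1f} GiB")
```

With a typical reservation like that, the OS reports roughly 3 to 3.5 GB of a 4 GB kit, which is why a 64-bit OS is the usual recommendation for 4 GB and up.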
He's right, the E6400 overclocks GREAT. I'm running mine on air with a Zalman 9700, and my E6400 is overclocked to 3.8 GHz from its stock 2.13 GHz, and the processor is way faster, especially when it comes to Photoshop CS3.
Yeah, I read that AMD 64-bit processors are not truly 100% 64-bit compatible, unlike Intel's, which are. AMD x64 is more of a blend between x86 and x64. I read that online; once I find the link again I'll post it.
I'm running a Core 2 Duo E6400 OC'd at 3.80 GHz and a single 512MB Radeon X1950 XTX, and I ran FRAPS...
~1280x1024 / all Medium / 4xAA, 16xAF with HQ AF on (enable AF and HQ AF in Catalyst Control Center)~
Avg: 44.222 - Min: 33 - Max: 61
~1280x1024 / Very High (unlocked Very...
If you go to Start / All Programs / Electronic Arts / Crysis SP Demo, it shows both "Play Crysis SP Demo" and "Play Crysis SP Demo 64bit". Is there a difference in speed and/or detail when playing the 64-bit version? Because I haven't noticed one, and I'm running a C2D and Windows XP x64.
Don't take it. Check out this review; it shows how your X1900 XT beats the X1950 Pro in just about every benchmark:
http://xtreview.com/review154.htm
The main thing the X1950 series has over your card is HDCP support.
Is there a difference between Crysis 32-bit and 64-bit?
Just curious what my card could do: I ran this game at 800x600 with everything Low and no AA, ran FRAPS, and got this:
Avg: 112.067 - Min: 86 - Max: 192
Then at Very High settings (after unlocking them) I ran it at 1600x1200 (the most my monitor can handle) with 6xAA, with FRAPS...
To the quote above me: how are you gonna have a nice setup like this
Core 2 X6800
8800 GTX
1 GB RAM
2056 x 1980 (30" display)
and ONLY have 1GB of RAM... wtf... lol. How do you expect to play Crysis at high detail? When I open my Task Manager, my PC shows about 1.2GB to 1.66GB of total...
Well, it's not in the game, but I have a Radeon X1950 XTX, so I go into the "ATI Catalyst Control Center" and add 16xAF with HQ AF on, so that way all games will play with AF on.
Well, that CPU needs to be upgraded fast. I'm running a Core 2 Duo E6400 OC'd at 3.80 GHz and a 512MB Radeon X1950 XTX. I ran the same settings as you but with AA and AF on (so imagine with AA and AF off), ran FRAPS, and this is what I got:
Avg: 44.222 - Min: 33 - Max: 61 ~1280x960 / all...
Wow, thanks a lot dude, I didn't know about that whole clock thing. I had upgraded from an X850 XT to this X1950 XTX card, and on the X850 XT the 2D and 3D clocks both had the same speed... strange. But hey, thanks for the info.
PS: I ran Crysis on my X850 XT and then on my X1950 XTX...
One last question... lol, sorry to keep bugging you with questions, but one last thing: when I open the Catalyst Control Center,
under "Information Center / Graphics Hardware" it shows this:
Core Clock in MHz: 506 MHz
Memory Clock in MHz: 594 MHz
Does yours read that way? And why is...
Quick question to SiG357 and infin@: are the X1950 XTX card and the X1950 XTX CrossFire Edition card (which I have) identical in every way? Because my computer detects mine as an X1950 CrossFire and not as an X1950 XTX. Just making sure they are, and that the core speeds are the same.
That's odd. Check out my specs below; I have the same card you have, at stock speeds.
Settings were: 1024x768, 4xAA, 16xAF, all High settings
Avg: 18.711 - Min: 11 - Max: 24 ~~ single card
And that's with one X1950 XTX and a C2D. You have two X1950 XTXs and a Quad Core... run it with my...