OK already, release info on the HIGH END cards due this year!

Wouldn't be the first time a driver *gasp* breaks a game!

I already did blame Vista's poor SLI support ;).

IBNextUrGoin2NotThink

Sorry, killer, a driver isn't going to kill SLI performance by 75% by accident, and Vista's poor SLI support *gasp* has nothing to do with it, because, once again, SLI is not supported in the Crysis demo, which *gasp* has been announced by nVidia and *gasp* is common knowledge around the forums.

IBNextUrGoing2MakeMoreShitUpCuzUrBackedIn2aCornerWithNoWhere2Go

It's funny, though: I don't even need to think to show your inconsistencies. All I need to do is click a few times and, bam, the proof, as they say, is in the pudding, à la the great power of the search engine! Behold!

http://www.legitreviews.com/news/4061/
 
Just to quote it for GoldenTiger. BTW, I have lost respect for GameSpot.

Crysis hasn't been out for more than two days and a number of people have already found out that SLI technology doesn't yet work on the new DirectX 10 game title. Granted, Crysis is only in beta, but it would still be nice to see SLI supported from the start. NVIDIA made a statement today that the final Crysis game will support SLI via a patch, so keep waiting! In other Crysis and NVIDIA news, it seems that the ForceWare 169.01 64-bit beta drivers aren't playing nice with the 64-bit version of the game. Looks like they have some more work to do with the 64-bit software.

"Although SLI is not supported in the demo, the final Crysis game with a patch will provide full SLI support." - NVIDIA
 

Who the hell posted that link? How dare he completely contradict GoldenTiger! GoldenTiger is the end-all and be-all of video cards and performance. Just look at the reviews he posted. Obviously SLI works; the newer drivers just remove 75% of performance, that's all. Shame on him for not thinking. nVidia is full of shit; SLI gave a 113% performance increase right from the get-go.
 
Now, now, let's not overdo it. We all believe what we want to believe, and I use gaming websites for guidance all the time. The GameSpot article would be convincing if I hadn't been following Crysis closely. I sent a message to James Yu asking him how he got SLI working for his GameSpot article when even NVIDIA says that's not possible. I doubt very seriously that I will hear back from him.
 
I wouldn't necessarily believe NVIDIA either. Remember when the 8800s were released, and they said something to the effect of releasing an nTune with support for shader clock changes? How many months later is it now, and is it there? Nope, and it took a third-party tool (RivaTuner) until just recently to do so.
 

lol

z32?
 
Amen, brother, amen! Let's face it: if you don't buy two cards at the same time, you're prolly not going to get a second one (some exceptions, of course, so don't let the fanboys start pissing their pants).

You buy a card thinking, "OK, just need to sell a few more pints of blood, then I'll get a second one for the ol' $LI setup in a month or so!" And does that happen? By the time you've got the cash, the second card is a POS, because something faster, better, and DX15-compatible is out, which requires a new mobo in any case. So that's why I'm not even thinking of an $LI mobo for my upgrade build this year.

Just bring out the new top performer GFX card so I can buy it and move on. :p

That's pretty much EXACTLY my thought on the subject. Back before I had kids and could buy the top-end toys, I wouldn't have even considered it as an upgrade path, because in all likelihood something new at 4x the performance would be available for nearly the same cost.

Add in the initial cost of having a mobo that's SLI-capable now, and it really doesn't make sense unless you crap piles of cash every time you go to the bathroom and you're going to SLI the absolute TOP card available.

Buying new computer hardware is like firing a .44 Mag handgun for the first time: you put it out there, close your eyes, and pull the trigger, all the while hoping like hell it doesn't come back and smack you in the head a fraction of a second later.
 
SLI is like Hyper-Threading... people are so eager to spend their money on it without really knowing the extent of the performance increase they'll actually see.

Simply put, SLI/XFire is a waste of money.

-Dolphin

Let's say I get 50fps in a game at 2560x1600. If I pop in a second 8800GTX, I'd better get 100fps to get my money's worth from what I just spent; if not, why bother? I like eXtreme performance, but not if it's a complete waste of money and doesn't give a 100% increase in game benchmarks, and certain games don't even take advantage of SLI at all.
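
For what it's worth, here's that "double the money had better mean double the frames" break-even logic in a few lines of Python. This is just a sketch: the 50fps figure is the hypothetical above, and the $455 price is the Newegg GTX figure quoted in the next post, not a benchmark.

[code]
# Break-even check for SLI value (Python sketch). The 50fps figure is
# hypothetical, per the post above; $455 is the Newegg GTX price quoted
# elsewhere in this thread. Neither is a measured benchmark.

card_price = 455.0   # USD, one 8800GTX
single_fps = 50.0    # hypothetical single-card result at 2560x1600

single_value = single_fps / card_price  # fps per dollar, one card

# To merely match the single card's fps-per-dollar, the SLI pair
# (twice the cost) has to deliver twice the frames:
required_sli_fps = single_value * (2 * card_price)
print(f"fps needed from SLI just to break even on value: {required_sli_fps:.0f}")
# -> 100. Anything less means the second card lowered your fps-per-dollar.
[/code]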

Single-slot solutions never give you the same percent return on higher prices either, so to insist that SLI has to is retarded. The cheapest 320MB GTS on Newegg is $250. The cheapest GTX is $455. That's an over-80-percent increase in price. Does it provide an over-80% increase in performance? Not even close. More like 40%.

I guess the GTX is/was a huge waste of money, then. Your arguments show a distinct lack of knowledge about the video card world.
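
And the quoted numbers do check out. Here's the math as a quick Python sketch; the prices are the Newegg figures from the post above, and the ~40% performance gap is that poster's ballpark, not a measured benchmark:

[code]
# GTS-vs-GTX value math (Python sketch). Prices are the Newegg figures
# quoted above; the 40% performance delta is the poster's ballpark.

gts_price, gtx_price = 250.0, 455.0   # USD, cheapest listings per the post
gts_perf, gtx_perf = 1.00, 1.40       # 320MB GTS normalized to 1.0

price_increase = (gtx_price - gts_price) / gts_price
perf_increase = (gtx_perf - gts_perf) / gts_perf
print(f"price increase: {price_increase:.0%}")   # 82%
print(f"perf increase:  {perf_increase:.0%}")    # 40%

# Relative performance per dollar (higher = better value):
print(f"GTS perf per dollar: {gts_perf / gts_price:.5f}")   # 0.00400
print(f"GTX perf per dollar: {gtx_perf / gtx_price:.5f}")   # 0.00308
# The GTS lands roughly 30% more performance per dollar, which is the
# whole point: the top card never returns 1:1 on price.
[/code]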
 
Hehe, wanna see a 320GTS at 1920x1200 versus a 768MB GTX in real gaming :D? Then we can talk about who has a lack of knowledge :p.
 
Is there a way to harness the hot air here for the good of mankind?

If this thread isn't tragic enough to compel those under NDA to break their agreements and spill the beans, then, well... I suppose they can stand more torture than me. Good men, possible CIA material.

On the other hand, much like Vogon poetry, this thread may force your server to self-destruct for the common good. Ahh, a selfless but sad and unnecessary act... for God's sake, break the NDA and save the [H] servers!!

WHERE - IS - YOUR - HUMANITY!! (William Shatner intonations and hand gestures)
 

I agree. NDA, schm-N-D-A. Just say yes, there is an 8800 Ultra rival on the way sooner or later.
 
I don't know about you, but I think the 8800GT is pretty high up there in performance.

And $200 or more certainly puts a card into obsessive gamer/geek territory; more than 90% of PC gamers won't spend more than $200 on their video cards.

I'm hoping to be able to get an 8800GT 512MB for under $200 before the end of the year. Fry's has already had one on pre-order for $229 for a little while now. $30 less after a rebate, on sale, shouldn't be too hard to find sometime in the next couple of months, I'm guessing. :D
 

QFT ~ the very "high end" computer products are nearly universally poor values, almost always delivering far less price/performance than the mid / mid-high range, with the occasional pedestrian mid/low-level star product giving you the performance of a top-of-the-line product at 25%-50% of the price:

Pentium 166 MMX, which ran at 233MHz
Celeron 300A, which ran at 450MHz
Pentium III "flip chips", which overclocked well and were fast for their time...
Pentium 4 "Northwood" 1.6A (was it A?), which easily did 2.0GHz
Pentium 4 2.4C -> 3.x GHz
Athlon XP "mobile" CPUs
GeForce 4600?
Athlon 64 3200+
Opteron 165 / Athlon X2 (various models)
Core Duo CPUs ("slower" models which overclock by as much as 50% or more)
Kingston DDR2-667 memory, which sold for $19 per 1GB stick and runs 100% solid at DDR2-800 speeds

And now, the GeForce 8800GT 512, the latest price/performance star of the enthusiast PC world! :D

All of those were the "smart" choices of PC geeks/enthusiasts, at least the ones who were not either A) fools (who are easily parted from their money) or B) people with more money than anyone rightfully should have :p :rolleyes:
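
Just for fun, the same "smart choice" test in a few lines of Python. Every figure here is an illustrative ballpark pulled from claims in this thread (the $229 Fry's pre-order price, the "about half the price, about equal performance" GT-vs-GTX claim), not a benchmark:

[code]
# Toy perf-per-dollar ranking (Python sketch). All figures are
# illustrative ballparks based on claims in this thread, NOT benchmarks.

parts = {
    # name: (price_usd, performance relative to an 8800GTX = 1.0)
    "8800GT 512MB":  (229.0, 0.90),
    "8800GTS 320MB": (250.0, 0.75),
    "8800GTX":       (455.0, 1.00),
}

# Sort by performance per dollar, best value first.
ranked = sorted(parts.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True)
for name, (price, perf) in ranked:
    print(f"{name:14s} {perf / price * 100:.2f} perf per $100")
# 8800GT 512MB   0.39 perf per $100
# 8800GTS 320MB  0.30 perf per $100
# 8800GTX        0.22 perf per $100
[/code]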


I'll play against you with my 1680x1050 20" widescreen gaming monitor powered by my GTS 320 any day of the week and call you punk after I'm finished with you :p Only, I took my power-hungry, somewhat overpriced 8800GTS 320 back to the store right before my 30 days were up, so I can get an 8800GT 512MB in a while, which, at about half the price, will give me about equal performance to your GTX.

It seems like SLI has almost never been a good value, as the trend has been to release newer cards with higher performance before the price of the last-generation cards comes down enough to make that old card in SLI an attractive deal compared to just getting one of the newer cards...

But maybe if you can get a good deal on a used 8800GTX to run in SLI, you'll still get substantially better performance than my one 8800GT ~ just wait until next summer, though, when it sends your AC bill through the roof :p You'll be cozy warm all winter if you have that rig heating your room. :D :D :D
 
I did not say that I own a GTX.

And I still "wanna see a 320GTS in 1920x1200 versus a 768GTX in real gaming": if you want smooth fps at that resolution with high settings (which is, and can be, subjective), I can't go with a 320GTS.

But none of this matters. NVIDIA will do what they will do. Only ATI could speed things up, but...
 

So? What does that have to do with anything? The GTX does not return an 80-100 percent performance gain over the 320 GTS. It's a simple fact: high-end PC parts do not return a 1:1 price-vs-performance ratio. It's practically a law of the universe.
 

Of course it does!!!! My epenis went from 6 to 12 after I bought my 8800GTX!!!

Too bad those were pixels not inches :p
 
So? What does that have to do with anything?
I was talking about real game situations at high resolutions with high settings, and for smooth fps there (subjective), you can't use a 320GTS. Sure, you can (you could also use an 8600), but if you want to really enjoy it with smooth fps, you go top high end, and buying a GTX or ULTRA at the end of 2007 would be bullshit. Right now the GT is certainly an OK card, but it is time for a new high-end card (the GT is not "wow" for the end of 2007; it's a year behind the GTX).

Something will surely come along, as usual. And the price will be too high :p. Let's see what NVIDIA will do between now and December/January.
 

Ummm... Dude....

I think something's wrong with yer thing'a'ma'jigger...

My E-penis easily hit 14 from 6 after I installed my GTX, then got another 0.000004 after overclocking...

You might wanna use the "Step-Up Program" soon!

:D:D:D

-V
 