A thread about G92 in 2019?! Yes...

G92 is a classic for sure. Aged incredibly well, especially the rare 1GB variants.

Ran 8800 GTs in SLI with my Athlon X2 Black Edition back in the day; absolutely smashed every game out at the time, except for Crysis lol

Years later I ran Crysis 2 on a single 8800 GT and was very surprised to find it worked fine at 1080p (turns out Crysis 1 was just poorly optimized this whole time?). Great chip. I still have a G92-based Quadro FX 3700 in the parts bin, but it doesn't seem to work anymore :cry:

Crysis 2 had reduced visual fidelity compared to the original FYI.
 
Correct. It was purposefully smaller in scope so that the then-current consoles would be able to cope with the workload. Framerates were still kind of shit on Xbox 360 and PS3, with the latter really chugging because of a lack of bandwidth on RSX. There was a lot of furor back then that all we got at launch was a DX9 port. A DX11 patch was later added that was heavily tilted in nVIDIA's favor, with insanely over-tessellated water that would make the 5900 series Radeons choke (AMD's tessellation units at the time were weaker by comparison, whereas Fermi was a tessellation powerhouse).

I would eventually like to get DVI capture and run some FCAT tools to see exactly how this rig holds up, and do it right for you guys. But until I get a different quad-core CPU this is on hold for now. Keep the good comments rolling though!
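Once I do get capture sorted, the plan is to look at per-frame times rather than just average FPS. As a rough sketch of the number-crunching side (not any official FCAT tool, just plain Python; the one-frame-time-per-line log format and the summarize() helper are made up here for illustration):

Code:
import sys
import statistics

def summarize(path):
    # Assumed homemade log format: one frame time in milliseconds per line.
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    frame_ms.sort()
    avg_ms = statistics.mean(frame_ms)
    # 99th-percentile frame time: the slow frames you actually feel as stutter.
    p99_ms = frame_ms[min(len(frame_ms) - 1, int(len(frame_ms) * 0.99))]
    print(f"frames: {len(frame_ms)}")
    print(f"average: {avg_ms:.2f} ms (~{1000.0 / avg_ms:.1f} fps)")
    print(f"99th percentile: {p99_ms:.2f} ms")

if __name__ == "__main__":
    summarize(sys.argv[1])

Average FPS hides the microstutter that AFR/SLI setups like this are notorious for, so the percentile frame times are what will really show how the rig holds up.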
 

The "over-tesselation" of Crysis 2 was FUD....fanboys not understanding how tesselation works (most of it was culled pre-render)...but who cares about facts when your favourite vendor is performing less than the competition...FYI ;)
The rest was spot on though.

EDIT:
Link to back up my claim - https://hardforum.com/threads/no-amd-300-series-review-samples.1865193/page-7#post-1041667964
 
Apparently I didn't get the memo about the tessellation thing being nothing but FUD. A line perpetuated by AMD fanboys butthurt because their cards couldn't tessellate as well, not due to a conspiracy or bad programming, huh? I could have sworn I had read an article about it somewhere as well, so it wasn't just the fanboys. I'll vet what I post more closely in the future.

For the record, I usually get whatever has the feature set I need at whatever budget I have at the time.

The "over-tesselation" of Crysis 2 was FUD....fanboys not understanding how tesselation works (most of it was culled pre-render)...but who cares about facts when your favourite vendor is performing less than the competition...FYI ;)
The rest was spot on though.

EDIT:
Link to back up my claim - https://hardforum.com/threads/no-amd-300-series-review-samples.1865193/page-7#post-1041667964
 
Last edited:
I still have my 3-way SLI bridge. Always brings back memories when I see it.

Edit: I also had a tri-SLI 8800 GTX setup.

I had the unfortunate problem of not having fingers that lined up for 3-way SLI, hence the three flexible bridges.

Also, when I did it I had to find a diagram of how the 3-way bridges were wired so I could copy it.
 

I know some boards didn't have the GPUs staggered properly.
 
I remember getting a new 8800 GTX and only getting 30 fps in Oblivion at 1280x1024 lol. Outside of Crysis and Oblivion, the card was a beast.

Anything above 30 was amazing at the time. I remember my X1900XT getting me like 35-40 at 1280x1024... and it was glorious. I could even throw on a few mods without performance tanking too hard.
 
I remember gaming at 1280x1024 on my old ViewSonic CRT. I think it was near the end of running my 8800 GT SLI combo that I finally replaced it with a 23" LCD @ 1080p. I definitely missed the smoothness of the CRT up until I finally got a 144Hz monitor.
 
I'm having the unfortunate problem of the fan on the top card spooling up to nearly 100% after about 3-5 minutes of gameplay. No dust in the card and fresh thermal paste, too. It's so loud it's obnoxious and drowns out the game I'm playing. I had a single 8800 GT back in the day and I don't remember them getting this loud.
 
So what about the card that's actually responsible for bringing the G92 into its prime?

Oh yeah, BRING IT!


I picked up one of these bad boys on launch week for $199! Nvidia was so embarrassed they had to drop their prices $75 to $100 overnight (in addition, they had to overclock both chips 10% to make them more competitive and relabel them as the 9800 GT/GTX+).


People forget, the 9800 GTX launched at $300 in April, and the 8800 GT was still $220 at the time.

And those price drops were across the board. The GTX 260 was embarrassed by the 4870 and had to drop to $300. They then had to make a special Core 216 version to actually make it competitive.
This is like comparing Vega 56 to the GTX 980. They were a generation apart. Nvidia's GTX 260 even came out before the 4850/4870 by a few weeks. So the older G92 is not a proper comparison to the 4850; the proper matchup is the lame 3870 vs G92.
 