What is a w-buffer and why can't my GeForce4 4400 do it?

LordJezo

I see an option in Beyond Good and Evil's graphics settings called a "w buffer", but enabling it makes everything break.

So..

What is this "w buffer" option, why can't I use it, and what kinds of cards can?
 
W-buffers are usually associated with fog.

I don't recall whether the GF4 has a w-buffer or not. If it breaks the game when you turn it on, then don't turn it on.
 
What is the W-BUFFER video option for?

OFP version 1.30 introduced a new option setting, W-BUFFER. This is for systems that have an Nvidia-based graphics adapter and are running OFP in 16-bit mode.

The W-BUFFER setting is forced on and cannot be disabled for GeForce3 graphics adapters, which always run in 32-bit mode, and for Voodoo 3 cards running in Glide 16-bit resolution.

So, what does W-BUFFER do? Here's a quote from Rage3D.Com:

The W-Buffer is another way to determine the depth of pixels in a scene. (For the techies: the Z-Buffer stores the actual depth of each pixel, while the W-Buffer interpolates 1/w (or 1/z), then 1/(1/w) is computed per pixel and stored in the depth buffer.) The advantage of the W-Buffer is that it distributes the depth values evenly across a 3D scene, whereas the Z-Buffer uses most of its precision for objects close to the viewport, leaving fewer values available for objects deeper in the 3D scene. This more uniform distribution of depth precision can help avoid pixel/polygon popping within 3D scenes, but the application has to support the W-Buffer for this setting to matter.

http://www.theavonlady.org/theofpfaq/install/wbuffer.htm
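A rough way to see the precision point from that quote is a quick Python sketch. The near/far plane distances and sample depths below are made-up numbers just for illustration, not anything from the FAQ or the game:

# Toy comparison of z-buffer vs. w-buffer depth value distribution.
# near/far planes and sample depths are arbitrary example values.
near, far = 1.0, 1000.0

def z_buffer_value(z):
    # Typical perspective depth: the stored value is a remapping of 1/z,
    # so most of the [0,1] range is spent on geometry near the near plane.
    return (1.0/near - 1.0/z) / (1.0/near - 1.0/far)

def w_buffer_value(z):
    # W-buffer: depth stored (roughly) linearly in eye-space distance.
    return (z - near) / (far - near)

for z in [1.0, 2.0, 10.0, 100.0, 500.0, 1000.0]:
    print(f"z={z:7.1f}   z-buffer={z_buffer_value(z):.4f}   w-buffer={w_buffer_value(z):.4f}")

With these numbers the z-buffer value is already past 0.9 by z=10, so everything from there to the far plane has to share the last sliver of the range, which is why distant polygons can pop or z-fight. The w-buffer values stay spread out evenly over the whole depth range.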

I thought it was first in the R300... oh well...
 
So that says my card should support it, considering a GeForce3 could do it.

I wonder why that specific option breaks my game.

Well, off to the Ubisoft message boards!
 