$BangforThe$
[H]ard|Gawd
- Joined: Nov 5, 2005
- Messages: 1,672
$BangforThe$ said: It has actually been talked about many times; it's just that this new benchmark is shedding more light on the subject. Almost all the reviews of nv cards talk about it as having 16-bit only (I actually believed it was 24-bit). It's probably the reason for the shimmering effect; maybe, don't know, don't care. I buy cards based on picture quality; to me that's the big deal. Many go for speed. To me a graphics card should be judged on picture quality first, then everything else.
LOL. The truth has to be distorted to match Fuad the moron's postings.

vanilla_guerilla said: amazing that no one has mentioned this before.
pxc said:LOL. The truth has to be distorted to match Fuad the moron's postings.
And why does $bang have to spam every forum with theinq links?
Terra said:So no one noticed this...until now?
(And still don't?)
Not even Futuremark under their driver validations?
And how did NVIDIA get WHQL drivers from Microsoft then?
Terra - This one is a dud...
ryan_975 said:well nVidia is in bed with all those listed, so they just let it slide.
j/k
I have a better question: why do you need to ask questions about Fuad's article? I'll tell you why... Fuad is an ignorant moron. He has no clue what he writes about. There may or may not be a problem, but Fuad mixes up even simple things like internal precision vs texture formats.

MartinX said: If you need the 24/32-bit thingys to do normal maps, and the nv stuff only supports 16-bit thingys, how have nvidia cards been doing normal maps for the last few years?
pxc said:I have a better question: why do you need to ask questions about Fuad's article? I'll tell you why... Fuad is an ignorant moron. He has no clue what he writes about. There may or may not be a problem, but Fuad mixes up even simple things like internal precision vs texture formats.
(Edit) The fine folks at B3D are having a laugh at how stupid Fuad is, bringing up the same points as above: http://www.beyond3d.com/forum/showthread.php?t=31786
FWIW, supplying higher than the minimum requested precision (24-bit, for example) is allowed by the DX spec. Remember the mixed mode path in HL2? It gained speed by running some shaders in 16-bit precision (instead of the non-nv-supported 24-bit precision), otherwise by default it would run in 32-bit precision at a huge performance hit.
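The FP16-vs-FP32 gap pxc is describing is easy to see outside a shader. Here's a rough sketch in Python, using NumPy's float16/float32 as stand-ins for the GPU's FP16 and FP32 register formats (the IEEE rounding behavior is the same; actual in-shader results depend on the driver and hardware, and NumPy has no FP24 type, so ATI's 24-bit case is omitted):

```python
import numpy as np

# FP16 keeps a 10-bit mantissa (~3 decimal digits of precision);
# FP32 keeps 23 bits (~7 digits).
x32 = np.float32(1.0) + np.float32(1e-4)  # the small increment survives
x16 = np.float16(1.0) + np.float16(1e-4)  # the increment rounds away

print(x32 > 1.0)   # True
print(x16 == 1.0)  # True: FP16 cannot resolve 1e-4 next to 1.0

# Above 2048, FP16 cannot even represent consecutive integers, which is
# one reason long-running low-precision math can drift visibly.
print(np.float16(2049.0) == np.float16(2048.0))  # True
```

That's the tradeoff the HL2 mixed mode path exploited: where the lost digits don't show on screen, FP16 is free speed.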
EVIL-SCOTSMAN said: It isn't just that Fuad nub who is stupid at the Inquirer; I have come to the uber leet conclusion that most of the writers at the Inquirer haven't got a fuxing clue as to what they are writing.
Maybe that's why it is just one huge laughable website, cuz all their writers are retarded and have weird names that sound as though they live in the desert and haven't even got electricity, never mind a fuxing computer.
The Inquirer is the biggest bullshitting "tech" website there is, closely followed by uncle TOM.
ryan_975 said: Well, his name is FUaD. Take out the 'a' and you have FUD... hmmm.
EVIL-SCOTSMAN said: Dude, great minds think alike. I was gunna call him Fud, but I don't know if it has the same meaning in the US as it does in the UK; in the UK it is the part of the female body that we spend the first 9 months of our lives trying to get out of, and the rest of our lives trying to get back into.
I dunno if it means the same in the states, but whatever, he is still a Fud.
Depends how you look at it. To be DX9 compliant, hardware needs to be able to run at 24-bit precision or higher (High Quality Cinematic Computing); whether or not YOU CAN run at 16-bit is not what Fuad is saying. He's just trying to play by Microsoft's rules.

pxc said: I have a better question: why do you need to ask questions about Fuad's article? I'll tell you why... Fuad is an ignorant moron. He has no clue what he writes about. There may or may not be a problem, but Fuad mixes up even simple things like internal precision vs texture formats.
(Edit) The fine folks at B3D are having a laugh at how stupid Fuad is, bringing up the same points as above: http://www.beyond3d.com/forum/showthread.php?t=31786
FWIW, supplying higher than the minimum requested precision (24-bit, for example) is allowed by the DX spec. Remember the mixed mode path in HL2? It gained speed by running some shaders in 16-bit precision (instead of the non-nv-supported 24-bit precision), otherwise by default it would run in 32-bit precision at a huge performance hit.
No, it doesn't "depend" how you look at it. Standard shader precision is 24-bit. Texture formats are different, and various depths and formats are allowed depending on the operation and framebuffer format. Fuad doesn't have a clue what he's writing about.

ElMoIsEviL said: Depends how you look at it. To be DX9 compliant, hardware needs to be able to run at 24-bit precision or higher (High Quality Cinematic Computing); whether or not YOU CAN run at 16-bit is not what Fuad is saying. He's just trying to play by Microsoft's rules.
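The texture-format-vs-internal-precision distinction being argued here can be sketched quickly. A hypothetical Python example (NumPy standing in for the GPU; the encode/decode convention is the usual one for normal maps, not any specific driver's): the normal map is stored in an 8-bit-per-channel texture, but the unpack and renormalize run at full float precision in the "shader", so storage depth and ALU precision are independent choices.

```python
import numpy as np

def encode_normal(n):
    """Pack a unit normal in [-1, 1] into an 8-bit-per-channel texel."""
    n = n / np.linalg.norm(n)
    return np.round((n * 0.5 + 0.5) * 255.0).astype(np.uint8)

def decode_normal(texel):
    """Unpack the texel back to a float normal, as a pixel shader would."""
    n = texel.astype(np.float32) / 255.0 * 2.0 - 1.0
    return n / np.linalg.norm(n)

n = np.array([0.3, 0.4, 0.866], dtype=np.float32)
texel = encode_normal(n)          # texture format: 8 bits per channel
recovered = decode_normal(texel)  # shader math: 32-bit float

# The round-trip error comes from the texture format's quantization,
# not from the ALU precision used to decode it.
err = np.max(np.abs(recovered - n / np.linalg.norm(n)))
print(err < 0.01)  # True: roughly 1/255 worth of error
```

This is MartinX's answer too: cards have been doing normal maps for years from 8-bit textures, regardless of what precision the shader ALUs run at.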
I think it is pretty wrong, considering when the 6 series came out everything would run in FP32, and only in FP16 if you specified it.

$BangforThe$ said:
Banko said: I think it is pretty wrong, considering when the 6 series came out everything would run in FP32, and only in FP16 if you specified it.
So I take it that it's BS?

Brent_Justice said: I giggled
pxc said:I have a better question: why do you need to ask questions about Fuad's article? I'll tell you why... Fuad is an ignorant moron. He has no clue what he writes about. There may or may not be a problem, but Fuad mixes up even simple things like internal precision vs texture formats.
(Edit) The fine folks at B3D are having a laugh at how stupid Fuad is, bringing up the same points as above: http://www.beyond3d.com/forum/showthread.php?t=31786
I see fanboys of all types at B3D. But it's a pretty consistent opinion that Fuad doesn't know what he's talking about.

Sharky974 said: B3D is full of raging Nvidiabots these days, like "Vysez", so that means nothing.
He spammed the forum with inquirer links?

coldpower27 said: DX9 allows partial precision of 16-bit when there isn't a need for the 24-bit "full" precision or Shader Model 3.0 32-bit precision. We will have to see if there is image quality degradation before we start accusing Nvidia of anything.
On another note: hmm, why is bang banned?
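The partial-precision argument can be sanity-checked numerically: when a shader's output lands in an ordinary 8-bit framebuffer, FP16's roughly three decimal digits are usually indistinguishable from FP32 after quantization. A small illustrative check in Python (NumPy types as stand-ins for shader precision; real drivers and edge-case values may differ):

```python
import numpy as np

def to_8bit(c):
    """Quantize a [0, 1] color channel to an 8-bit framebuffer value."""
    return int(round(float(c) * 255.0))

c32 = np.float32(0.7371)  # full-precision shader result
c16 = np.float16(c32)     # same value at partial (FP16) precision

# In [0.5, 1), FP16 resolves steps of ~1/2048, finer than the 1/255
# steps of the framebuffer, so the stored pixel is usually identical.
print(to_8bit(c16) == to_8bit(c32))  # True for this value
```

Which is exactly why the question is whether there's visible image quality degradation, not whether FP16 is "less precise" on paper.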
I thought we were talking about shader precision. I'm no ATi partisan (I don't see where I mentioned Microsoft DX9 specs requiring 32bit texture precision). Currently I preffer ATi's x1K over nVIDIA's 7x00 series, but then again I always preffer what is more advanced (technology wise). So if we're not talking about Shader precision (which I believed is what Fuad was saying) then why the hell is this even news?pxc said:No, it doesn't "depend" how you look at it. Standard shader precision is 24-bit. Texture formats are different and various depths and formats are allowed depending on the operation and framebuffer format. Fuad doesn't have a clue what he's writing about.
It takes a lot more work to try to twist around what he wrote into something that resembles reality than it's worth. But as an ATI partisan, you look to seize on anything you can bash nvidia with, even if it makes no sense. I read your post on the B3D thread.
Sable said: But then take a look at this thread on AnandTech where Gstanfor shows the picture has been photoshopped. You can even see it in the original images around the edges of the water.
OWNED!!!!!!!!

Terra said: *ROFL*
That's one way of giving your benchmark a piss-poor rep, before it's even out
Terra...