NV40 image

I don't care if it does sound like a leafblower, it has to work damned hard to be louder than the 9 fans that are in my PC now anyway.

If it's fast I am happy. Besides I've got headphones.
 
Originally posted by CleanSlate
the 5900 series can't hold a candle to the 9800 series for AA/AF and IQ... so how is that equivalent?

~Adam

Yeah, I was gonna say!

You don't drop $300+ on a card and NOT use FSAA and Aniso. And the GeForce line's FSAA is just....inferior. It takes a bigger performance hit to enable, and the quality is *noticeably* less.

Granted, it has MUCH wider game compatibility, but, still.....
 
It's nice to see dual DVI on the highest end card, finally.

Not that I like LCDs.
 
idk about that pic. The card itself looks OK,

but look at the shadowing and the reflection. The reflection is kinda weird; it looks like it was pasted in there.
 
Originally posted by dderidex
Granted, it has MUCH wider game compatibility, but, still.....
MUCH is not really accurate... ATi has slightly less game compatibility, and even that is an arguable point.
 
the compatibility difference goes from a small gap to a canyon when you look at OpenGL games though.
 
any game compatibility issue that I've found with an ATi card (maybe one or two games at most) has been easily fixed by switching drivers
 
Personally I hate having to switch drivers just to play a specific game. But recently nVidia is becoming just as guilty, with new drivers breaking games that used to work fine.
 
Originally posted by Merlin45
the compatibility difference goes from a small gap to a canyon when you look at OpenGL games though.

Amen to that! The stupid radeons kept doing wack shit with my OpenGL games (Quake III, Urban Terror, Warcraft III).

This included (but was not limited to):
Quake III:
-The pixels being arranged not in the normal way (filling the whole screen in a rectangular grid), but in a "bee-hive" pattern, where periodically a pixel would be missing, drawing hexagonal shapes all over my screen.

-Huge performance hits on OpenGL games (and that's comparing a shitty Riva TNT 16MB to a Radeon 7200 64MB, on Cat 4.3 or the Omega drivers)

-The box the Sky textures are bound to has LARGE BLACK EDGES. Welcome to Cubeworld Quake III Arena.

-Textures would appear a shade more orange

-Textures would flash wildly at me, as if I had turned off trilinear filtering

-Particle effects gone-diddly-gone.


Warcraft III in OpenGL:
-Lighting effects too bright and too green
-Sprites vanish when a lighting effect is near them
-Particle effects would draw black dots
That's just the title screen, on to the gameplay
-All the terrain would sink into a great sea of black matter, and float out of it a little.
-Everything is a shade dark and purple.
-Everything is shifting colours during scrolling
-LAAAAAAAAG
-Particle effects, yet again, gone
-Models with dynamic textures don't use them properly.

Diablo II, DirectDraw:
-Massive performance gain
-Crashes after a minute or two

Those were the only three games I had during that time. I first exchanged the card, thinking that it was defective. It wasn't. Had to return it and buy a GeForce instead. But the performance gain in Diablo II was awesome! If only it didn't crash a minute after I left the town.
 
Originally posted by Merlin45
the compatibility difference goes from a small gap to a canyon when you look at OpenGL games though.

Like what?
 
Originally posted by lorcani
Guys, no one here has seen any benchmarks for either the NV40 or the R420 (or, if they have, they are sworn to secrecy by an NDA). So, let's wait for the reviews to come out before we make decisions like this. You'll only have to wait until the day after the release, and they'll still be ludicrously overpriced and uncommon then. :p

Buying before reading any reviews is a costly mistake. No room for any fan-boys in this world. Who knows, maybe nVidia will totally spank ATi with the NV40.

Out of curiosity, what is your source, barubal?
One of my friends is a moderator on another site where he got this pic; I asked him if I could have it and share it with you guys.
 
I like the piezo speaker in the top right corner above the molex. Anyone want to bet this thing screams like a banshee when it's overheating/underpowered?
 
It's either a sound-cancellation device or a buzzer for when temps go up, like on a lot of mobos.

The white thing on the left of the heatsink is a multimedia header for I2C and video input from an external daughtercard. The daughtercard allows PVA capture. :)

NV2.jpg

NV1.jpg

That's all I've got to show for now ;)
(thx MuFu)
All from the B3D forums.
------------
btw: sign up now.
http://www.ati.com/
ati's secret weapon :D :D :D

And yes, I hate fanboys. I support both companies.
 
Damn, that thing looks like the deck of an aircraft carrier. And here I thought my GF4 cut case ventilation in half. :eek:
 
So isn't there gonna be any CRT support for the card? Because AFAIK all the gamers who will buy it use CRT monitors.
 
Originally posted by aces170
So isn't there gonna be any CRT support for the card? Because AFAIK all the gamers who will buy it use CRT monitors.

It will come with a DVI to VGA Adapter.
 
Now if only the 2nd DVI could be split to 2 analog outputs like the Parhelia... surround... *drool*
 
Originally posted by dderidex
Yeah, I was gonna say!

You don't drop $300+ on a card and NOT use FSAA and Aniso. And the GeForce line's FSAA is just....inferior. It takes a bigger performance hit to enable, and the quality is *noticeably* less.

Granted, it has MUCH wider game compatibility, but, still.....
Well for one, AA/AF algorithms can change between architectures, so who knows if the NV40 uses the same one (hope not). Second, NV's AF has been superior to ATi's for a while now, but that little note always seems to get overlooked.
 
Originally posted by M4d-K10wN
Are you retarded?

Fuck you, buddy, I've had it with your trolling. Some mod PLEASE ban this asshat; just look at his post history.

For your fucking information, I have _NEVER_ had a problem with OpenGL games and my Radeon. So stop talking out of your ass.
 
Hmm... if I buy this I'm stuck with the decision of buying a new PSU or a second PSU dedicated to the card. Probably a new PSU; I don't have the time to be bothered setting it up to turn on with the rest of the computer.
 
Nvidia's AF is not superior, it is more accurate. Being more accurate alone does not make it superior, though.

ATi's AF allows for further levels of AF with less performance hit; 16x is standard, but I have heard that it goes all the way to 256x (though ATi will not let users select this setting, as it would really slow things down).

This is an approximation: ATi at 16x is just as accurate as Nvidia at 8x at certain angles (like 45 and 90 degrees); at other angles, ATi is better. However, ATi is faster even with the greater depth.

Comparing ATi at 8x to Nvidia at 8x, Nvidia is more accurate at all angles; ATi is only accurate at some angles.
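That "accurate only at some angles" behaviour can be sketched with a toy model. This is not ATi's actual hardware algorithm, just a hypothetical illustration of angle-dependent AF, where the full requested level is only applied near the preferred angles (multiples of 45 degrees):

```python
def effective_af(requested_af, surface_angle_deg):
    """Toy model of angle-dependent anisotropic filtering.

    Purely illustrative: near the 'preferred' angles (multiples of
    45 degrees) the full requested AF level is applied; in between,
    the sampler falls back to a lower level. Real adaptive-AF
    hardware is more nuanced than this.
    """
    # Distance (in degrees) to the nearest multiple of 45
    off_axis = min(surface_angle_deg % 45, 45 - surface_angle_deg % 45)
    if off_axis < 5:
        return requested_af          # close to a preferred angle: full quality
    return max(2, requested_af // 4)  # off-angle: quality quietly drops

print(effective_af(16, 90))  # a wall: full 16x
print(effective_af(16, 22))  # a sloped surface: reduced to 4x
```

The upshot: a floor or a wall (0 or 90 degrees) gets full-quality filtering, while a sloped surface quietly drops to a lower level, which is why "more accurate at all angles" versus "accurate at some angles" is a real distinction.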
 
The NV40 is superior in technology in a couple of ways for sure:

It has FP32 support, and it is also superior with Pixel and Vertex Shader 3.0 support.

By contrast, ATi only has FP24 and Pixel/Vertex Shader 2.0 (with a few 3.0 features thrown in).

I would think that an image rendered with Nvidia's hybrid approach (FP32 for close-up textures, FP16 for faraway ones, and maybe the sky at INT12) would look better and be much faster than the full-screen FP24 that DX9 and ATi use.

There is no need for a texture that might be the rendered equivalent of a few miles away to be rendered in FP24. At that range INT12 should be adequate, and would probably boost performance quite a bit. It's the same idea as using fewer polygons on character models that are only a few pixels high, really far away.

As for how long it will take to see these games... well, honestly, it might take a while. UT2004 is still pretty much DX7. Pixel and Vertex Shader 3.0 has promise, but most video cards still have the vast majority of their die space dedicated to speedy fixed-function calculations.
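The precision argument can be put in rough numbers. A quick sketch (the bit counts are the commonly cited layouts, so treat them as assumptions: FP16 has 10 mantissa bits, ATi's FP24 has 16, FP32 has 23, and INT12 is taken as 12-bit fixed point over [0, 1]):

```python
# Quantization step near 1.0 for each format mentioned in the post.
# Assumed layouts: FP16 = s1e5m10, FP24 = s1e7m16, FP32 = s1e8m23;
# INT12 is treated as 12-bit fixed point.  Note that a float's step
# scales with magnitude, while fixed point has one uniform step.
formats = [
    ("INT12", 12),
    ("FP16", 10),
    ("FP24", 16),
    ("FP32", 23),
]

for name, bits in formats:
    step = 2.0 ** -bits
    print(f"{name}: smallest step near 1.0 = {step:.2e}")
```

Near 1.0, FP16 already resolves steps of about 0.001, well below anything visible on a distant surface, which is the intuition behind mixing precisions the way the post describes.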
 