First NV40 3DMark Score???

cornelious0_0

[H]F Junkie
Joined
Apr 6, 2003
Messages
12,783

Originally posted by Global_Inferno
Still, don't expect me to make my mind up until I see the R420!

Exactly, neither am I. I'm going to be buying one of them and ditching my XT so I'll most likely end up waiting until we've got both X800's to compare to the 6800.....otherwise I'll just end up kicking myself later. ;) :p
 

Balderdash

Gawd
Joined
Jun 25, 2003
Messages
744
That's not proof of anything - it doesn't show the resolution: I can get scores in the 11,000 range now without even overclocking by lowering the resolution to its lowest setting and forcing the lowest mip-map level...

On the other hand, it could very well be a real score, but my point is, the picture there doesn't prove anything, one way or the other.
 

M4d-K10wN

[H]ard|Gawd
Joined
Mar 7, 2004
Messages
1,373
You could also replace all the textures with files containing nothing but a single dot, so there's effectively nothing to render, and get insane framerates.
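
Something like this is all it would take. Here's a minimal sketch of the trick in plain OpenGL (my own illustration, nothing to do with the score in question): a 1x1 "dot" texture that makes every texture fetch hit the same cached texel, so texture bandwidth drops to nearly nothing.

    /* Minimal sketch: build a 1x1 white "dot" texture in plain OpenGL.
       Binding this in place of a real texture means every sample reads
       the same texel, so texture bandwidth cost all but disappears. */
    #include <GL/gl.h>

    GLuint make_dot_texture(void)
    {
        static const unsigned char white[4] = { 255, 255, 255, 255 };
        GLuint tex;

        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1, 1, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, white);
        return tex;
    }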
 

fallguy

Guest
I also doubt they are approved drivers. And you can photochop anything...
 

Merlin45

[H]ard|Gawd
Joined
Oct 6, 2003
Messages
1,072
Well, as the only approved drivers are the 52.16s, which I guarantee don't support the NV40, of course they aren't using approved drivers.
 

tracerwilly

Limp Gawd
Joined
Jan 15, 2004
Messages
136
Uhh... yeah, it does show the res in that pic. It's right in the middle and shows 1024x768, which is the default. Those scores are STUPEFYING. Reading the new May issue of MAXIMUM PC, where they have an article on the 6800, it looks like GeForce might pull a fast one and battle back to the top of the heap, because even though the R420 has 16 rendering pipelines, it does not support Pixel Shader 3.0 technology. When is [H] gonna get one to test? hehe.
 

pbXassassinX1524

[H]ard|Gawd
Joined
Nov 24, 2003
Messages
1,295
Originally posted by cornelious0_0
Exactly, neither am I. I'm going to be buying one of them and ditching my XT so I'll most likely end up waiting until we've got both X800's to compare to the 6800.....otherwise I'll just end up kicking myself later. ;) :p
Wow man, talk about being a front runner. Going from a 9800XT to the next-gen top runner... It's going to be like a whole new world for me, coming from my little FX5200.
 

dderidex

Supreme [H]ardness
Joined
Oct 31, 2001
Messages
6,328
even though the R420 has 16 rendering pipelines, it does not support Pixel Shader 3.0 technology

Yeah, but nothing does yet.

As in, software side.

There is nothing currently out that supports PS 3.0, and nothing currently in development that supports it. Granted, late in development that capability *could* be added to a game, but given that most games have roughly a five-year development cycle.....

I don't think we have to worry about this difference anytime soon.

In fact, I don't even think nVidia's Cg can *create* PS 3.0 shader code yet. IOW, unless you like writing in raw assembly, there is no way to even *generate* PS 3.0 code yet.

(Granted, I *am* keeping my eye on the nVidia product and may well end up snagging one this summer....maybe. Tools to develop with it WILL be out pretty soon. The flow control PS 3.0 brings would be fun to play around with, if nothing else. And who likes to play around with that kind of thing using the reference rasterizer?)

In the end, assuming roughly equal performance, this difference WILL make this the developers' card of choice this summer....but consumers certainly shouldn't care. Yet.
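
(For a rough idea of the flow control in question, here's a hand-written, purely illustrative HLSL fragment targeting ps_3_0, embedded as a C string the way an app would hand it to a shader compiler. As noted above, no current tool generates this for you.)

    /* Illustrative only: ps_3_0 dynamic branching. Under ps_2_0 the
       compiler must flatten the "if" (both sides execute and one result
       is masked); ps_3_0 hardware can actually skip the untaken side. */
    static const char g_ps30_branch_src[] =
        "float4 main(float3 n : TEXCOORD0) : COLOR            \n"
        "{                                                    \n"
        "    float ndotl = dot(normalize(n), float3(0, 0, 1));\n"
        "    if (ndotl > 0)       // dynamic branch, SM 3.0   \n"
        "        return float4(ndotl.xxx, 1);                 \n"
        "    return float4(0, 0, 0, 1);                       \n"
        "}                                                    \n";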
 

fallguy

Guest
Originally posted by Merlin45
well, as the only approved drivers are 52.16 which I guarantee don't support the nv40, of course they aren't using approved drivers.

I know that. I was implying that other drivers can inflate the score. Meaning, it's not very accurate.
 

Roach

Limp Gawd
Joined
Apr 30, 2003
Messages
467
Yeah the drivers could be causing an inflated score, just like that incident a few months back. :rolleyes:
 
Joined
Jun 26, 2000
Messages
893
Pixel Shader 3.0 is not a feature that will influence my buying decision. It's only supposed to be an evolutionary step up from PS2.0, not a revolutionary upgrade like PS2.0 was over 1.0. What will determine my buying decision is speed and image quality in games.
 

cornelious0_0

[H]F Junkie
Joined
Apr 6, 2003
Messages
12,783
Wow, how did I know that there'd be this much of a negative uprising against this post??? :p

Come on people, none of us (myself included) is in a position to judge the card or the company on upcoming hardware.....not until it's no longer upcoming. I'm not saying that I completely believe everything I see online about the card(s), but it's still quite exciting.

I have a funny feeling that if the card DOES turn out to perform as some of these "results" are showing, there'll be quite a few people with their heads hung. ;)

It makes no sense to start pre-judging and saying things like "Yeah, well.....it's fake" or "Yeah, well, they're probably using dots for textures" or something of that nature. You're in no position to say what it IS, so you can't say what it ISN'T either.

Without making any enemies over this, I just wanna say that we should sit back and wait.....to see what comes of this. The exact same things were being said when the 9800XT and other previous "next gen" cards launched. Some results were positive and some were negative. Let's wait until the card is released before we decide which is which.
 

Bad_Boy

2[H]4U
Joined
Dec 17, 2003
Messages
3,282
I'm just wondering why everybody thinks PS/VS 3.0 isn't good for anything.

OK, let's say you get an R420, and all you've got is PS2.0/VS2.0... and then PS3.0/VS3.0 games start pumping out... what will you think then?

If Far Cry is already adding PS3.0 support, I'm pretty sure the games to follow, like Doom 3 and HL2, will have it also.

Microsoft seems to think it will help a lot, hence the "Shader Model 3.0 - No Limits" title.
http://www.microsoft.com/whdc/winhec/partners/shadermodel30_NVIDIA.mspx

Pixel Shader 3.0 requires 32-bit precision, so NVidia won't throw 32-bit out the window. ATI will stick with 24-bit, which is potentially faster. The disadvantage is that 24-bit is not enough for PS3.0.
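
A rough way to see that precision gap (my own sketch, not how any driver actually works: it truncates instead of rounding, and it ignores FP24's narrower exponent):

    /* Approximate ATI's FP24 (16-bit mantissa) by clearing the low 7
       mantissa bits of an IEEE 754 float (23-bit mantissa). */
    #include <stdio.h>
    #include <string.h>
    #include <stdint.h>

    static float to_fp24(float f)
    {
        uint32_t bits;
        memcpy(&bits, &f, sizeof bits);
        bits &= ~(uint32_t)0x7F;   /* 23-bit mantissa -> 16 bits */
        memcpy(&f, &bits, sizeof f);
        return f;
    }

    int main(void)
    {
        float uv = 4096.03125f;    /* e.g. a large texture coordinate */
        printf("fp32: %.5f\n", uv);          /* prints 4096.03125 */
        printf("fp24: %.5f\n", to_fp24(uv)); /* prints 4096.00000 */
        return 0;
    }

At values around 4096, a 16-bit mantissa can only step in increments of 1/16, which is the sort of rounding error the FP32 requirement in PS3.0 is meant to rule out.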
 

cuber

Limp Gawd
Joined
Apr 7, 2004
Messages
165
Originally posted by cornelious0_0
Wow, how did I know that there'd be this much of a negative uprising against this post??? :p

Come on people, none of us (myself included) is in a position to judge the card or the company on upcoming hardware.....not until it's no longer upcoming. I'm not saying that I completely believe everything I see online about the card(s), but it's still quite exciting.

I have a funny feeling that if the card DOES turn out to perform as some of these "results" are showing, there'll be quite a few people with their heads hung. ;)

It makes no sense to start pre-judging and saying things like "Yeah, well.....it's fake" or "Yeah, well, they're probably using dots for textures" or something of that nature. You're in no position to say what it IS, so you can't say what it ISN'T either.

Without making any enemies over this, I just wanna say that we should sit back and wait.....to see what comes of this. The exact same things were being said when the 9800XT and other previous "next gen" cards launched. Some results were positive and some were negative. Let's wait until the card is released before we decide which is which.
You're right, but isn't it cool how people respond? It gets me more excited to see benchmarks and stuff... even though I don't buy a vcard over $200, 'cause I'm a cheap bastard :)
 

cornelious0_0

[H]F Junkie
Joined
Apr 6, 2003
Messages
12,783
Originally posted by cuber
You're right, but isn't it cool how people respond? It gets me more excited to see benchmarks and stuff... even though I don't buy a vcard over $200, 'cause I'm a cheap bastard :)

I feel sorry for you dude, you're missing out on all the fun. ;)

If Far Cry is already adding PS3.0 support, I'm pretty sure the games to follow, like Doom 3 and HL2, will have it also.

I don't think it's relevant that it has FUTURE support for it right NOW, 'cause by then we won't be thinking about little old Far Cry; we'll be on to bigger and better things.

Having support for something and actually using it are two completely different things. Just because the capabilities are being coded into the games, that doesn't mean that it won't go unused.

Oh, and just in case you're wondering, Doom III is OpenGL and does not use pixel or vertex shaders.....that's DX you're talking about.....two different things. ;)
 

cuber

Limp Gawd
Joined
Apr 7, 2004
Messages
165
Yeah, I guess, but I just don't think it's worth getting a high-end vcard for my $500 machine.
 

Bad_Boy

2[H]4U
Joined
Dec 17, 2003
Messages
3,282
Originally posted by cornelious0_0

Oh, and just in case you're wondering, Doom III is OpenGL and does not use pixel or vertex shaders.....that's DX you're talking about.....two different things. ;)

lol, you got me there.

I guess what I'm trying to say is that if I pay for a $500 card, I want every feature I can get, so it will last me a long time.
 

leukotriene

2[H]4U
Joined
Sep 18, 2002
Messages
2,664
Far Cry does support PS3.0.

It was patched in very recently and does not produce any performance boost so far as anyone can see. The primary differences between PS2.0 and PS3.0 are minimum and maximum shader length plus branching support, all of which seem unlikely to increase performance by much. It will allow for neater visual effects, but unless the game is written for PS3.0 initially, I doubt you'll see those either.
Basically, once most games start to use PS2.0 to the full (Far Cry uses PS1.1 and PS2.0), then I'd worry about PS3.0. For the NV40 and R420, it's really going to be about speed and image quality in PS2.0 games and the many lingering DX8/8.1 PS1.1/1.4 games.
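
To make the shader-length/branching point concrete, here's a purely hypothetical HLSL fragment (made-up names, not from Far Cry or any shipping game). A PS3.0 target can run this loop over a light count the app sets per frame; a PS2.0 compile has to unroll it to a fixed count and pay the full instruction-count cost on every pixel:

    /* Illustrative only: a per-pixel light loop with a dynamic bound,
       something SM 3.0 allows and SM 2.0 must unroll at compile time. */
    static const char g_ps30_lights_src[] =
        "int    g_numLights;      // set per frame by the app  \n"
        "float3 g_lightDir[8];                                 \n"
        "float4 main(float3 n : TEXCOORD0) : COLOR             \n"
        "{                                                     \n"
        "    float3 c = 0;                                     \n"
        "    for (int i = 0; i < g_numLights; ++i)             \n"
        "        c += saturate(dot(normalize(n), g_lightDir[i]));\n"
        "    return float4(c, 1);                              \n"
        "}                                                     \n";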

The R420 is rumored to support VS3.0 but only PS2.0+ by the way.

As for approved versus unapproved drivers, there's a reason drivers get approved or rejected: application detection and shader substitution.

I saw over at The Inquirer that they had revised their NV40 3DMark03 score down from 20,000 to 12,300 or so, which is very close to the score reported in the link in this thread.
 

cornelious0_0

[H]F Junkie
Joined
Apr 6, 2003
Messages
12,783
Yeah, I guess, but I just don't think it's worth getting a high-end vcard for my $500 machine.

Yeah, I guess that's fair. ;) :cool:

I guess what I'm trying to say is that if I pay for a $500 card, I want every feature I can get, so it will last me a long time.

I know where you're coming from, but unfortunately, with the way the market is.....that's never TRULY going to happen. Not for quite a while, anyways. I don't have a grudge against Nvidia or anything, but as long as there are companies out there like them trying new things (which I actually think is a bold move).....we'll never have hardware that supports EVERYTHING.

I know how nice it would be, but we're close. ;)
 

cornelious0_0

[H]F Junkie
Joined
Apr 6, 2003
Messages
12,783
Originally posted by zoltan
notice the picture blocks the name of the video card on 3dmark03

Alright, so what if I told you it was the R420? Would it change anything? Not really.

The score is what it is.....we'll find out the details later on when we see if it matches up with what Nvidia has to say at the launch on Tuesday!!! :D

I REALLY hope they let you watch the launch streaming off their site like they did with the NV30. Hmm, I wonder if you can still go watch that thing. They left it on the site for quite a while.....wonder if it's still there. ;)

EDIT: Off topic, but...........this is kinda cool also; just saw it on the way to check for that video. I really kinda like how nvidia is so much more "all over" in different markets. It's interesting.

EDIT: Bah, I can't find it. :rolleyes:

Heh, 4444 posts. ;)
 

dderidex

Supreme [H]ardness
Joined
Oct 31, 2001
Messages
6,328
I really kinda like how nvidia is so much more "all over" in different markets. It's interesting.

Yeah, well, I'd like to see them "all over" PC multimedia.

As it is, their lack of a real competitor to ATI's "All-In-Wonder" line is pushing me back to ATI.

Okay, so they have their crap 'personal cinema', but it really is crap!

1) It's only available on their budget cards. Oh, sure, nVidia has a product *listed* for a 5900 Personal Cinema (not an Ultra, and not a 5950, though)....but no partner is MAKING one! The best you can buy is a 5600 (non-Ultra) Personal Cinema card. Bleh. ATI always has their top-end card out with an 'All-In-Wonder' variant.

2) Several key missing features. First off, what's with the 12 different third-party applications you get to use to DO anything? Couldn't nVidia at least find SOME way to integrate all that? (Granted, ATI isn't a *whole* lot better - they just slop everything together on a single taskbar, but at least they all *look* the same and have similar controls)

And, more interestingly, there's no ability to stream across a network. This is a fairly new feature for ATI (they call it "Eazyshare"), but it is REALLY NICE.
 

Bad_Boy

2[H]4U
Joined
Dec 17, 2003
Messages
3,282
Originally posted by leukotriene
It was patched in very recently and does not produce any performance boost so far as anyone can see.

uhh.
BECAUSE THERE ARE NO PS3.0 CARDS OUT! ;)

Originally posted by cornelious0_0
I don't have a grudge against Nvidia or anything, but as long as there are companies out there like them trying new things (which I actually think is a bold move).....we'll never have hardware that supports EVERYTHING.

I meant that if PS3.0 is out, I'd rather have it over PS2.0, just because it would last longer. PS2.0 will be outdated.
So if one card has a feature right now that the other doesn't, I'll probably pick it, assuming the cards are otherwise equal.

There's no possible way I could get everything I want in one card ;)
 

SnowBeast

[H]ard|Gawd
Joined
Aug 3, 2003
Messages
1,312
Originally posted by cornelious0_0


Oh, and just in case you're wondering, Doom III is OpenGL and does not use pixel or vertex shaders.....that's DX you're talking about.....two different things. ;)

WRONG!! You might want to look at Carmack's .plan updates from a few years ago, when the GeForce 3 came out. He showed off the pixel shading he was implementing in Doom 3, with the lighting and shadows coming off of fans and characters. OpenGL still has pixel shading, just not written in the same code path as DirectX. Basically, OpenGL has both, just different ways of implementing them. ;)
 

Merlin45

[H]ard|Gawd
Joined
Oct 6, 2003
Messages
1,072
But he only uses first-generation pixel and vertex shaders, the OGL equivalents of SM 1.1.
 

Global_Inferno

Limp Gawd
Joined
Aug 31, 2003
Messages
238
I didn't start this thread so that it could turn into a huge argument... it was simply a "news" post.

I am sceptical myself as to whether or not the results are valid. However, I don't see anything more "solid" proving NV40 scores/performance!
 
Joined
Jan 27, 2004
Messages
898
Originally posted by SnowBeast
WRONG!! You might want to look at Carmack's .plan updates from a few years ago, when the GeForce 3 came out. He showed off the pixel shading he was implementing in Doom 3, with the lighting and shadows coming off of fans and characters. OpenGL still has pixel shading, just not written in the same code path as DirectX. Basically, OpenGL has both, just different ways of implementing them. ;)

Pixel shaders are supported in OGL through extensions. However, D3 places heavy emphasis on stencil shadowing, which is probably what you're mixing it up with.
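
For example, here's a minimal sketch of one of those extensions, ARB_fragment_program, roughly the OGL counterpart of DX ps_2_0 (my own illustration; it assumes the extension entry points are already resolved):

    /* Bind a trivial ARB fragment program: modulate a texture sample
       with the interpolated vertex color. */
    #include <GL/gl.h>
    #include <GL/glext.h>
    #include <string.h>

    static const char fp_src[] =
        "!!ARBfp1.0\n"
        "TEMP col;\n"
        "TEX col, fragment.texcoord[0], texture[0], 2D;\n"
        "MUL result.color, col, fragment.color;\n"
        "END\n";

    void bind_fragment_program(void)
    {
        GLuint prog;
        glGenProgramsARB(1, &prog);
        glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
        glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB,
                           GL_PROGRAM_FORMAT_ASCII_ARB,
                           (GLsizei)strlen(fp_src), fp_src);
        glEnable(GL_FRAGMENT_PROGRAM_ARB);
    }

Doom 3's fallback render paths use the older vendor extensions (NV_register_combiners, ATI_fragment_shader) for pre-2.0 hardware, which lines up with Merlin45's SM 1.1 comparison above.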
 

cornelious0_0

[H]F Junkie
Joined
Apr 6, 2003
Messages
12,783
Well, what I was getting at is that they aren't EXACTLY the same, even though a lot of people think everything's the same.....and that Doom III is built on DX or something. :rolleyes:

Whatever, the point is it looks cool. :cool:
 

df12

Limp Gawd
Joined
Sep 7, 2000
Messages
198
Originally posted by cornelious0_0
Well, what I was getting at is that they aren't EXACTLY the same, even though a lot of people think everything's the same.....and that Doom III is built on DX or something. :rolleyes:

Whatever, the point is it looks cool. :cool:

This quote from Carmack seems to imply that Doom3 IS a DX9 game:
John Carmack
Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.
 

cornelious0_0

[H]F Junkie
Joined
Apr 6, 2003
Messages
12,783
Originally posted by df12
This quote from Carmack seems to imply that Doom3 IS a DX9 game:

Umm, not really.....not if you actually read it. I guess it might depend on what angle you're coming at this from, but it doesn't sound like he's saying Doom is DX9 at all.
 

df12

Limp Gawd
Joined
Sep 7, 2000
Messages
198
I guess it's all in how you want to spin it, personally. But at the very least, it could be the reason for the Doom3+DX9 speculation...
 