Nvidia gets official endorsement of Doom3.

No, Carmack likes money, and I'm sure id has been paid by nV. ATi is clearly better; money is the only explanation.

;)
 
ahh c'mon... nVidia could make a huge comeback, they've beaten ATi all this time except with the FX series.

Hopefully they will realize the error of their ways.
 
This is crap.

I am not a fanboy of either brand. I just ordered my first ATI card today, and I can't wait to try it.

But! I absolutely hate it when game companies get behind non-standardized hardware. It pisses me off to no end. I do not want to see the day when a game only runs on "an nVidia platform" or an "ATI platform".

I will stop gaming when that happens, and it may happen soon :(

Oh, and a quote from the linked article. Someone said it better than I could've:

Why can't game companies just make games, and graphics card companies just make graphics cards. Wasn't there some sort of bridge building game where you could only play it on Nvidia cards? This whole thing is starting to sicken me.
 
Originally posted by pbXassassinX1524
ahh c'mon... nVidia could make a huge comeback, they've beaten ATi all this time except with the FX series.

Hopefully they will realize the error of their ways.

I don't intend to be biased at all with the next round of cards.

if nVidia puts out a card that hands ATi its ass on a plate, then I'll be going nVidia. But nVidia's got a magic trick to pull off, and I'll believe it when I see it.
 
killa, when Carmack tested the NV30 vs the R300, the NV30 was faster. NV has gone so far as to make a Doom3-specific feature: UltraShadow (of course it can also be used by other games if the developers choose). It's no surprise that iD gave an endorsement; it's the same position Carmack stated last year.
 
Originally posted by pxc
killa, when Carmack tested the NV30 vs the R300, the NV30 was faster. NV has gone so far as to make a Doom3-specific feature: UltraShadow (of course it can also be used by other games if the developers choose). It's no surprise that iD gave an endorsement; it's the same position Carmack stated last year.

Not exactly an apples-to-apples comparison, though, using the NV30 code path vs. the standard ARB one. But, that said, Doom 3 is one of those games that's going to be impossible to benchmark current graphics hardware under true apples-to-apples IQ settings.
 
Originally posted by Killa|3yte
I don't intend to be biased at all with the next round of cards.

if nVidia puts out a card that hands ATi its ass on a plate, then I'll be going nVidia. But nVidia's got a magic trick to pull off, and I'll believe it when I see it.
Yea, I fully agree, and I'd have no problem buying ATi if their card curb-stomps nVidia again. But nVidia's driver suite seems very good to me right now. (Though I've never had experience with ATi.)
 
Originally posted by John Reynolds
Not exactly an apples-to-apples comparison, though, using the NV30 code path vs. the standard ARB one. But, that said, Doom 3 is one of those games that's going to be impossible to benchmark current graphics hardware under true apples-to-apples IQ settings.

Just what I was going to say. The NV30 needs its own path to be faster; when run on the standard ARB path, the R3xx is much faster.
 
The NV30 needs its own path to be faster; when run on the standard ARB path, the R3xx is much faster.

Well, the only caveat I'd add to that comes from OGL research.

Remember that base OpenGL is still very much a DX7-generation API.

The ARB path is not really 'standard'; it's built on extensions added to the API to allow DX8- and DX9-level effects.

But it IS an extension.

Just like all the NV_ extensions add DX8 and DX9 level effects, too.

You could just as easily argue that coding for ARB extensions is just as much 'coding for ATI' as coding for NV_ extensions is 'coding for nVidia'. Just using base OpenGL with none of the shader extensions gets you...well, no shaders at all. You HAVE to use extensions to get any shaders out of it, and different vendors prefer different extensions.

The only difference is that ATI doesn't put its name or abbreviations in the extensions it prefers, and those extensions have wider support among other vendors.

Hopefully, this will all be resolved in OGL 2.0.

One could almost take this a step further in the DX world and say that nVidia actually follows the spec more closely. The spec requires a minimum of FP24, as many here keep pointing out, but it also allows a partial-precision hint to increase speed. nVidia supports this, ATI doesn't.

Just posting a 'from the other side of the fence' position. It's hardly worth arguing that ATI is indisputably faster at DX anyway, but as someone programming in 3D APIs, nVidia's partial-precision support is... interesting. I look forward to playing with it some.

Put another way... imagine what ATI could do (performance-wise) if they had adopted partial-precision support. They kick ass running FP24 all the time... what if they could drop to FP16 when only partial precision is needed, like nVidia does? Imagine the performance on their cards then!
 
Originally posted by fallguy
Just what I was going to say. It needs its own path to be faster. When run on the standard ARB path, the R3x is much faster.

I was going to say the same. And the path for the NV30 uses lower quality settings too. But Carmack said it doesn't affect the game visually.

Then again, I think it's not fair that they didn't make a path for ATI using the same tricks. If the visuals don't change, who knows how fast ATI cards would be?

Then the same thing happened with HL2, where nVidia cards need a special path mixing DX8 and DX9 and lower quality settings to perform as well as ATI cards running all DX9, and this time there is a visual difference.

Though Carmack said that those quality shortcuts won't work in future DX9 games without causing visual problems, due to the complexity.

I hear a lot of talk that the reason the R300 from ATI was so good is that they focused on the DX stuff, rather than trying to create some non-standardized stuff like the FX. The games I heard were going to use FX-specific features were Gunmetal and Stalker. Well, did Gunmetal (I hope I have the name right) get the promised motion blur on jets and stuff for nVidia cards?
 
Originally posted by Killa|3yte
No, Carmack likes money, and I'm sure id has been paid by nV. ATi is clearly better; money is the only explanation.

;)

And how is ATI "clearly better" in D3, since you've never seen the finished product run on an ATI card?

There's little doubt ID received some cash from nVidia. It's no different than the $6 million + Valve received from ATI.
 
Originally posted by TimothyB
I was going to say the same. And the path for the NV30 uses lower quality settings too. But Carmack said it doesn't affect the game visually.

No, I think Carmack did at one point say a sharp eye might at times notice a few minor issues where the dynamic range isn't quite large enough with the NV30 path, but he certainly downplayed them and stated that the majority of players would never notice. Could be wrong, though.
 
Originally posted by Badger_sly


There's little doubt ID received some cash from nVidia. It's no different than the $6 million + Valve received from ATI.

ATi paid them for the bundle, not to get better performance as you have implied before. It's not official whether Doom3 is bundled with nVidia hardware yet.
 
Originally posted by fallguy
ATi paid them for the bundle, not to get better performance as you have implied before. It's not official whether Doom3 is bundled with nVidia hardware yet.

Wow, that statement sounds incredibly naive. When you are planning on selling a 50 dollar product and one single customer plops down 6 million+ dollars, that customer gets whatever the hell they wish. And you, as a company, are inclined to do whatever you can to make them happy.
 
Originally posted by DocFaustus
that customer gets whatever the hell they wish. And you, as a company, are inclined to do whatever you can to make them happy.


LOL... I might agree with you, but VALVe has CLEARLY taken ATi's money and run with it as far as I'm concerned. ATi started the whole bundle with the XT. ATi EXPECTED the game to come out Sept. 30. What a joke; HL2 is no longer even on the horizon, and the latest dates put it at NEXT Sept. 30 easily. If I were ATi, I'd go find Mr. Newell, demand my 8 million back, or give him a good old fashioned Chicago smiley face.

VALVe totally gave ATi the shaft.
 
Originally posted by DocFaustus
Wow, that statement sounds incredibly naive. When you are planning on selling a 50 dollar product and one single customer plops down 6 million+ dollars, that customer gets whatever the hell they wish. And you, as a company, are inclined to do whatever you can to make them happy.

Got proof? Didn't think so.

They paid money to have the game bundled with their cards; pretty simple to me. Why would Valve make the game slower on FX cards on purpose? They wouldn't. In fact, they took extra time making it faster for them.

FX cards suck at PS 2.0; look at Farcry for an example.
 
I don't see why some of you are so wound up over this. Doom3 is but one game, and not a very good one in my opinion :p anyways. For starters, FarCry will shit all over it in my opinion :p. And if you want to see how crap nVidia cards are compared to ATI's offering in that, then please see the enclosed screenies ;) http://www.beyond3d.com/forum/viewtopic.php?t=10473&start=0

In my way of thinking there are going to be far more good DX9 games coming out than OpenGL games, and OpenGL is currently all nVidia is any good at. So I say they can have Doom3 all they want, as it's a dead-end title anyways; it's boring and unoriginal to say the least.
And when released it will probably look better on ATI anyways lmao :p

God I love freedom of speech lmao :D
 
It's important because many, many games will use the engine for a long time. Look at CoD... on a modified Q3 engine, from years ago.

OpenGL games usually run a little faster on NV cards.
 
Neither nVidia nor ATI has the money to "buy" DOOM 3 or Half-Life 2.

Believe me, I don't believe Valve or id Software would ever allow themselves to be bought.

Come on, is excluding 50% of the market a wise move for game developers who are known for making great products that satisfy their customers?

It's all PR; they get sponsored, that's all. Their main income will come from us gamers buying their games, not from the video card makers, so I can't believe anyone reads anything into this.

Why, in that case, did Valve spend so much time optimizing for nVidia cards, and even allow the game to run in a specific nVidia mode, if they wanted Half-Life 2 to only run well on ATI cards?

Hardly because they are sponsored by ATI :rolleyes:

And why doesn't John Carmack make DOOM 3 run only in Cg mode :rolleyes:

Hardly because it´s sponsored by nvidia :rolleyes:

The money they get from video card makers is small potatoes compared to what they get from all the users buying the game. That, plus the fact that I believe game developers love what they do and want people to see their products in the best fashion possible, guarantees it will play well on both ATI and nVidia hardware regardless of marketing schemes.

Come on, how many nVidia-sponsored games have we seen running better on ATI hardware, for example? The same goes for some ATI-sponsored games. The general fact is that nVidia cards don't score proportionally worse in sponsored games than in any other game ;)
 
It makes sense from a speculative standpoint to assume nVidia cards will outperform ATi in D3, varying code paths aside, as nVidia's implementation of OpenGL has been a notch above ATi's. I think one thing is for sure: ATi cards will run D3 just fine. And if there are issues, ATi will optimize.
 
I'd like to know how any of you are in a position to criticize someone like Carmack, yet think everything is fine and dandy when ATI does the same thing (likely for much more money, since it involved packaging the game with cards). Ever consider that the GFFX just plain runs Doom3 better? Of course, nothing will stop people like fallguy from coming in and saying that ATI did their endorsement for performance while NVIDIA did it to "buy" a special code path. Maybe Carmack actually prefers OpenGL to DX9, and NVIDIA cards are the ONLY ones that are going to be running Doom3 on Linux. You don't see ATI supporting that.
 
Originally posted by @trapine
I don't see why some of you are so wound up over this. Doom3 is but one game, and not a very good one in my opinion :p anyways. For starters, FarCry will shit all over it in my opinion :p. And if you want to see how crap nVidia cards are compared to ATI's offering in that, then please see the enclosed screenies ;) http://www.beyond3d.com/forum/viewtopic.php?t=10473&start=0

In my way of thinking there are going to be far more good DX9 games coming out than OpenGL games, and OpenGL is currently all nVidia is any good at. So I say they can have Doom3 all they want, as it's a dead-end title anyways; it's boring and unoriginal to say the least.
And when released it will probably look better on ATI anyways lmao :p

God I love freedom of speech lmao :D
Doom3 is a dead-end title? And you've determined this from a handful of screenshots and a movie? Great justification. Even assuming you are correct, expect the Doom3 engine to be used as much as the Quake3 engine; yes, the same engine used in great games like RtCW, ET, SoF2, CoD, and more. Are you going to argue now that all derived works using the Doom3 engine are going to suck as well? It wouldn't surprise me, considering it would be about as valid as your "Doom3 is a dead-end title" assumption.

As far as Farcry goes, the demo far from impressed me. Painkiller was much more enjoyable, and the UT2k4 demo beats both. However, I am still looking forward to the retail versions of all three games.
 
Originally posted by fallguy
ATi paid them for the bundle, not to get better performance as you have implied before. .................

I implied no such thing. But you are correct on two points. ATI paid Valve for the bundle, to sell cards, and, while there was no better performance, to put on the Monster-Cable-like Shader Day gibberish show.

Of course Valve realizes that their main income will come from selling the game itself, so after taking the money and running, they're finishing up the game to run equally well on ATI and nVidia cards.
 
I couldn't care less about ATI or nVidia brand loyalty. When I purchase hardware, I get what's best at the moment for what I want to do, which is to game. I've had a Voodoo2, a Voodoo5 5500, a GeForce3 Ti500, and now an ATI Radeon 9800 Pro. I've been pleased with each, usually because I was an early adopter of each one, and they were pretty big steps performance-wise over each other. </endrant> Er, just trying to say I'm not a fanboi of either.

That being said, I have a great deal of respect for John Carmack; I think he's pretty smart. And he's not just a good programmer: I think he understands the way the hardware industry works (even sitting in on many committees, trying to steer them all in the 'right' direction).

I don't pretend to know all the answers or the what the truth in this particular matter is. But I'm -inclined to think- if John Carmack gets behind something, it's because it's good and has future value. They may have also paid iD software $$, but I'd -like to think- that he would have refused it if he didn't want to stand behind the hardware.

Looking at his track record, I think these are fair assumptions to make. I mean hey, he's already rich, and he's trying to make a spaceship just because he can. Does he really need to whore himself out to make more money? I wouldn't think so.

Maybe I'm wrong, who knows, but that's just how I think it is. I'll trust the Carmack to make wise decisions. And I'll make my own decisions in buying my hardware/video card.

Besides, it's not like any of us are going to go buy an nVidia card BEFORE we see the benchmarks or how it runs DOOM 3. I think most of us will rationally look at the cards available to us and purchase the one that is best for what we want to do.
 
He prolly did it on account of the fact that an ATI guy leaked the Doom 3 alpha; I'd prolly do the same thing. The question is, is there going to be any noticeable performance drop on account of this agreement? This might have the same effect as "Budweiser is the official beer of id Software".

And what about UT2k4 splurging that nVidia logo at load-up? That's prolly all we'll see: "Doom 3: nVidia, the way it was meant to be played".
 
The only thing that really pisses me off about nVidia is the goddamn "The Way It's Meant To Be Played" logo in games.:rolleyes:

How come ATi doesn't have their own logo in games?
 
Originally posted by obs
I'd like to know how any of you are in a position to criticize someone like Carmack yet think everything is all fine and dandy when ATI does the same thing (likely for much more money since it involved packaging the game with cards). Ever consider that the GFFX plain runs Doom3 better? Of course nothing will stop people like fallguy coming it and saying that ATI did their endorsement for performance and NVIDIA did it to “buy” a special code path. Maybe Carmack actually prefers OpenGL to DX9 and NVIDIA cards are the ONLY ones who are going to be running Doom3 on linux. You don't see ATI supporting that.

I didn't say anyone bought performance. Go read and try again.
 
Originally posted by obs
Even assuming you are correct, expect the Doom3 engine to be used as much as the Quake3 engine. Yes the same engine used in great games like RtCW, ET, SoF2, CoD, and more.


Well, I am not so sure on that one. No doubt we will have some great titles use the D3 engine, and no doubt folks that have done a game in one engine tend to stay with that engine (for example, RavenSoft did SOF/JK in Q2, SOF2/JK2/3 in Q3). However, the MoH:AA folks have switched to the UT2k3/4 engine. And with decent engines becoming more and more frequent, you may see more titles shop around. Nothing against the D3 engine, but they may find a better deal elsewhere. I know the UT2k3 engine goes for about $500,000. Stuff like that COULD mean fewer D3 titles. Then again, it may attract more. Who's to say :)
 
I honestly can't understand some of the bitching that goes on. Who cares if you see an "nVidia: the way it's meant to be played" logo? Who cares if ATI doesn't have one? Everyone here sees the benchmarks for the cards and knows what they want to run their games.

If nVidia winds up running Doom 3 better, then fine; if ATI is a few notches lower and that's what I've got, fine. I highly doubt Carmack is trying to be biased in his hardware choices. He's simply making his decision and going with what he likes.

Also, why even bother speculating on NV's and ATI's next cards and how well they'll perform? More than likely we'll see them before Doom 3 or HL2 anyway. I'll go with whichever card is better if I upgrade again when they come out. I'm sure the image quality and frame rate will be fine on both cards. I don't see id shooting themselves in the foot and making Doom3 run so much better on nVidia that ATI fans would be disappointed.

Besides, wasn't the alpha at E3 run on an early release version of the 9800?

http://www.webdog.org/plans/1/
 
Originally posted by Vagrant Zero
Yup, ATI is where the leak originally sprung from.
I am pretty sure it was run on a 9700 pro too.
 
Well, id has handled the whole situation poorly: first with the Doom3 test, and now endorsing a product before the game is released. I am also unhappy with a good portion of the ATi/HL2 deal...

However, it is unfair to blame an employee of a corporation; likely others at the firm influenced the decision. The wording is such that many might infer more than was intended. It states the FX series will play Doom3 fine, which was a given.

Anyways, recent events:

ATi rumors have the R420 ready early.
ATi is waiting to clear the channel of 9800s.
nVidia counter-punches... will ship late, but with 16 pipelines.
nVidia tries to stall sales with the Doom3 endorsement.
 
I think it's going to be a great battle.

ATI / 1/2 life

vs.

NV/D3


Put on the boxing gloves and lower your prices.
 
It was never confirmed that ATi leaked it; just rumors, and even more rumors said it wasn't. The fact is, we don't know.
 
This whole thing is a bunch of bullshit... and I fell for it. Back when I wanted to buy a graphics card over a year ago, I figured I should buy a real high-powered card because Doom3 and HL2 were just around the corner and needed such a card. So I bought a 9700 Pro. Hmm... the only thing I play on it is Soldier of Fortune 2 MP... at 1280x1024 w/8X AA & 16X AF :) . I could have gotten by with a Ti4200, waited for the two games to get released, and then bought a GF8 or a Radeon 15000. Never again will I fall for this crap. So id and Valve can chime in on whatever they want.
 