Valve sucks

tranCendenZ said:

Not really, since the 9600Pro/XT don't get much of a boost from DX8.1, unlike the FX series, and no 9600Pro/XT customer would expect high resolutions and high framerates, since it is a mid-end card, unlike the 5800/5900.
So it seems you misunderstood what I was trying to say. I hope you see what I'm saying now.

Anyway, just look at your own framerates, the 9600Pro gets the exact same framerate in DX9.0 as the 5900 Ultra does in mixed mode. So the 5900 Ultra is getting the framerate of a mid-end card, at reduced image quality. I see no reason for the 9600Pro to be running at reduced quality, the problem is clearly with the 5800/5900s.
 
tranCendenZ said:
Clearly mixed precision should have been implemented for the 5800/5900 series. And note, these were using old, unoptimized for dx9 NV drivers.
That's what a lot of people are forgetting.

Xbitlabs and other sites that used the stolen September 2003 build in their reviews were showing much better performance with later drivers. Valve specifically told review sites not to use the 50.xx drivers to test the HL2 beta, so THG's and the other sites' results from the September 2003 beta test were a worst-case scenario.

Pointing to that as any kind of proof now would be like saying that [H]'s initial Doom3 results for the 9800 (using official Cat3.4 drivers) proved anything: http://www.hardocp.com/article.html?art=NDc0 .
 
fubar569 said:
All of this fussing back and forth... be thankful you have the game and that it runs. Myself, I'm stuck with a 9200SE if and when I decide to sign my life over to Valve/Steam...

It's playable on the 64 MB ATI 9000 in my laptop, if that's any help.
 
chef-x said:
So how do I undo the mod shown in the thread?
Which one? If you added -dxlevel 90, you undo it by launching with -dxlevel 81 instead.

If you did the further "fake 9800" mod (the modified dxsupport.cfg), restore your backup of that file and delete config.cfg again.
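If it helps, this is roughly where those go (I'm writing the menu wording from memory, so it may not match your Steam client exactly):

    Steam > right-click Half-Life 2 > Properties > Launch Options:
        -dxlevel 81     (the DX8.1 path the game picks for FX cards by default)
        -dxlevel 90     (forces the full DX9.0 path)

    config.cfg lives somewhere like ...\half-life 2\hl2\cfg\config.cfg --
    delete it and the game rebuilds it with defaults on the next launch.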
 
<hasn't read any posts here...>

Personally, I don't hate Valve for their ATI bum-loving; everybody has an angle these days. The one thing I wish they would do is fix the damn hitboxes in Counter-Strike: Source. The game is slowly heading towards being uninstalled... no point playing if a spray-and-pray can do just as well as me...
 
lol i never have any problems with the hitboxes..cause i am always whupping everyone's ass hehe
 
BlastRadius said:
no point playing if a spray-and-pray can do just as well as me...
That's what made HL2 playable for me. Aim anywhere close, get a hit. The enemies don't have the same luxury.
 
Hello!

I just wanted to try the method described on the first page of this thread. Now I have two questions, just to be sure.

1. Step 2 says that I have to "select any file inside the folder half-life 2\bin", which is where I have to enter the .DLL path. How can I do this? I can only select one file.

2. At the end it says that I have to type the ATI Vendor and Device ID into the section "DirectX Device IDs". Do I have to do this even though I have an NVIDIA card? And if so, which one should I choose: Radeon 8500 or 9800Pro?

Many thanks for your help. :)
 
pxc said:
That's what made HL2 playable for me. Aim anywhere close, get a hit. The enemies don't have the same luxury.
Wait, that's a GOOD thing?!
 
Guys, this is all about MONEY, man... if you haven't figured that out. All of us know that ATI's cards suck at the Doom3 engine and OpenGL games in general... I'm sure ATI knows about that too, but they just sit on their lazy butts and don't give us reasonable OpenGL drivers...

Anyways, since Nvidia's going to advertise their cards with Doom3 all the time... ATI won't stand a chance if they don't "ally" with Valve. What's interesting is whether someone can dig through Valve's or ATI's financial statements to figure out how much money ATI paid Valve for all this... I would expect something in the 10-20 million dollar range (since HL2 is probably going to sell about 5 million copies, and ATI probably thinks it's worth it to support Valve for 2-4 dollars per copy).

And I hate this fucking Steam thing... it killed off Counter-Strike... good old 1.5 is no longer playable (which SUCKS!!!!!!!!!). Is it just me, or is Steam slow as a piece of cow dung even on my duallie Athlon?

Another thing to point out is that ATI hasn't done too well since the release of the Radeon 9700 Pro (and 9500 Pro)... now those two cards were awesome at the time, because the FX line sucked butt... but look at what ATI did after that:

1. Radeon 9600Pro (slower than the 9500Pro...)
2. Radeon 9600SE, 9200SE, 9200... what the heck is going on??? Going backwards in performance?
3. Radeon 9800Pro (or as I call it, a Radeon 9700Pro with a die shrink, because ATI can't put out much better GPUs)
4. Radeon X800 Pro -- only 12 pipes
5. Radeon X800XT -- where can you find one, and are you willing to pay an arm and a leg?
6. Radeon X800XL -- the X800Pro was rocked by the GT, so they had to paper-launch a product line to delay purchases of Nvidia's cards...
7. Never delivered the rewritten OpenGL driver...

So I would put ATI's success down to just the introduction of the Radeon 9700 Pro core... and times have changed... I would argue that ATI is about half a year behind Nvidia's release schedule for their product lines... look at what Nvidia's doing: SLI, 6800GT, 6600GT, 6800... they are all great cards... and the 6200 isn't that bad either... and at least NV cards work in Linux and FreeBSD!!!!

If you think my analysis of the situation is accurate, please set aside company loyalty and vote with your wallet next time. I have absolutely no loyalty to any company; I just buy whatever is the better product...
 
Hi, I did those 3 steps and 3D Analyze gives me the error CREATEPROCESS FAILED. I have a P4 1.8 GHz, 512 MB RAM, and a GeForce FX 5600 Ultra with the 61.77 drivers on XP, and I get 40 fps on low details with DirectX 8.0.

I want to get more fps, and then better graphics.
I want to get around 70 fps and I don't know how to get it!

Sorry, my English is not good...

Please post a screenshot of how you set up 3D Analyze. Thanks.


These are the steps I did:

1. select the HL2.exe file in the half-life 2 folder
2. select any file inside the folder half-life 2\bin
3. select Steam.exe
then check these options:
- under the section Pixel and Vertex Shader: FORCE LOW PRECISION PIXEL SHADER
- under the section Remove Stuttering: PERFORMANCE MODE
- on the bottom left: FORCE HOOK.DLL


At the end, 3D Analyze gives me the error CREATEPROCESS FAILED.

My ICQ is 320829923, please help. Thanks all.
 
tshen_83 said:
Guys, this is all about MONEY, man... if you haven't figured that out. All of us know that ATI's cards suck at the Doom3 engine and OpenGL games in general... I'm sure ATI knows about that too, but they just sit on their lazy butts and don't give us reasonable OpenGL drivers...

If it was that simple to build OGL drivers as good as NVIDIA's, then I'm sure the others would have done so.

Anyways, since Nvidia's going to advertise their cards with Doom3 all the time... ATI won't stand a chance if they don't "ally" with Valve. What's interesting is whether someone can dig through Valve's or ATI's financial statements to figure out how much money ATI paid Valve for all this... I would expect something in the 10-20 million dollar range (since HL2 is probably going to sell about 5 million copies, and ATI probably thinks it's worth it to support Valve for 2-4 dollars per copy).

In case you didn't know, ATi gives free copies of Half-Life 2 with their videocards. But ATi still has to buy them from Valve obviously, so they paid Valve.

1. Radeon 9600Pro (slower than the 9500Pro...)
2. Radeon 9600SE, 9200SE, 9200... what the heck is going on??? Going backwards in performance?

It's called low-budget. It's not about performance, it's about making it cheap. NVIDIA does the same thing, introducing the 5800 first, then the 5200 and 5600. And again, the 6800 first, then the 6600 and 6200. And they still sell many older models as well. The worst example is probably the GF4MX, which is actually based on the original GeForce 256 architecture.

3. Radeon 9800Pro (or as I call it, a Radeon 9700Pro with a die shrink, because ATI can't put out much better GPUs)

Nothing wrong with that. Just means that people get slightly better products when they buy later.

4. Radeon X800 Pro -- only 12 pipes

NVIDIA offers plenty of products with 12 or fewer pipelines as well. Which is good for people who don't want to buy the most expensive high-end cards.

5. Radeon X800XT -- where can you find one, and are you willing to pay an arm and a leg?

Availability is a problem, yes, but the price is no worse than the 6800 Ultra's.

6. Radeon X800XL -- the X800Pro was rocked by the GT, so they had to paper-launch a product line to delay purchases of Nvidia's cards...

That's just clever marketing, NVIDIA did the same with their FX series back in the day. Nothing wrong with that. If you're smart enough, you know what to buy anyway.

7. Never delivered the rewritten OpenGL driver...

You actually have a point here. So far they seem to have only done specific optimizations for Doom3.

So I would put ATI's success down to just the introduction of the Radeon 9700 Pro core... and times have changed... I would argue that ATI is about half a year behind Nvidia's release schedule for their product lines... look at what Nvidia's doing: SLI, 6800GT, 6600GT, 6800... they are all great cards... and the 6200 isn't that bad either... and at least NV cards work in Linux and FreeBSD!!!!

Yes, NVIDIA does have some nice stuff out. On the other hand, the X800XTPE (if you can find one) does perform great, and has excellent image quality. For most people it's still good value for money (most people aren't interested in SLI or linux/freebsd anyway). So ATi isn't behind in the things that matter to most people.

I agree, just buy whatever is best, don't buy something because of the brand that's on it.
I just don't think that NVIDIA is that much of a better deal for most people.. not like the 9700 was compared to the FX anyway.
 
Scali said:
Not really, since the 9600Pro/XT don't get much of a boost from DX8.1, unlike the FX series, and no 9600Pro/XT customer would expect high resolutions and high framerates, since it is a mid-end card, unlike the 5800/5900.
So it seems you misunderstood what I was trying to say. I hope you see what I'm saying now.

You call the 4fps boost that the 5900 Ultra gets in that benchmark from mixed mode to DX8.1 a big boost? (this is according to Valve's benchmarks). The 5900 Ultra, according to their benchmarks, gets 47-48fps in mixed mode DX9 and 51-52fps in DX8.1

Anyway, just look at your own framerates, the 9600Pro gets the exact same framerate in DX9.0 as the 5900 Ultra does in mixed mode. So the 5900 Ultra is getting the framerate of a mid-end card, at reduced image quality. I see no reason for the 9600Pro to be running at reduced quality, the problem is clearly with the 5800/5900s.

lol. If the 5900 Ultra in mixed-mode dx9 gets the same framerate as the 9600PRO in dx9 (playable framerates) and gets little boost from dx8.1, why shouldn't it be programmed to have the option of running at the dx9 detail level by using partial precision?
 
tranCendenZ said:
You call the 4fps boost that the 5900 Ultra gets in that benchmark from mixed mode to DX8.1 a big boost? (this is according to Valve's benchmarks). The 5900 Ultra, according to their benchmarks, gets 47-48fps in mixed mode DX9 and 51-52fps in DX8.1

The argument was that 9600Pro/XT should run DX8.1 instead of the current path. The current path is DX9.0, not mixed mode.
And my answer to that was: "No, it doesn't matter a lot in performance".
The same case does matter for 5900U. DX9.0 is much slower than either mixed mode or DX8.1, so therefore in that case the answer would be yes.
You're just trying to twist things around. The boost I spoke of was from DX9.0 to DX8.1, mixed mode does not apply to Radeons obviously.

lol. If the 5900 Ultra in mixed-mode dx9 gets the same framerate as the 9600PRO in dx9 (playable framerates) and gets little boost from dx8.1, why shouldn't it be programmed to have the option of running at the dx9 detail level by using partial precision?

Again you are twisting things around. I was talking about how the 9600Pro doesn't have a problem with running in DX9.0 mode, I didn't say anything about what mode the 5900U should be running in.

If you ask me, people who bought a 5900U should complain to NVIDIA, not Valve. The real problem is not that Valve chose not to use partial precision, the problem is that this choice makes the 5900U perform like a $99 budget-card from ATi, when running the exact same code!
You should ask yourself how that is possible. The answer can't be that Valve is screwing NVIDIA, since they run the same code as the ATi cards, standard DX9.0 code.
The answer comes from NVIDIA itself: the 6800 series DOES perform properly with the standard DX9.0 code, so there must be something wrong with the FX series.
 
This thread will not die. Sorry to all the FX owners who bought a POS DX9 card.
 
Scali said:
The argument was that 9600Pro/XT should run DX8.1 instead of the current path. The current path is DX9.0, not mixed mode.
And my answer to that was: "No, it doesn't matter a lot in performance".
The same case does matter for 5900U. DX9.0 is much slower than either mixed mode or DX8.1, so therefore in that case the answer would be yes.
You're just trying to twist things around. The boost I spoke of was from DX9.0 to DX8.1, mixed mode does not apply to Radeons obviously.

Well Scali, guess what, that is the whole point of this thread. Valve should have kept in the mixed-mode path that they programmed for FX cards, because it's not much slower than dx8.1 mode and it makes DX9 playable on FX5900 cards.

Again you are twisting things around. I was talking about how the 9600Pro doesn't have a problem with running in DX9.0 mode, I didn't say anything about what mode the 5900U should be running in.

If you ask me, people who bought a 5900U should complain to NVIDIA, not Valve. The real problem is not that Valve chose not to use partial precision, the problem is that this choice makes the 5900U perform like a $99 budget-card from ATi, when running the exact same code!

And any company right now could make a game with only SM3.0 and SM1.1 paths that would make the x800xtpe look like a $50 MX4000 budget card from Nvidia. After all, the x800xtpe does not comply with the latest DX9 standards :). Point? It's not about what is "standard," it's about what D3D features companies choose to program for. If it is in the DX9 spec, it is a DX9 standard, period.

You should ask yourself how that is possible. The answer can't be that Valve is screwing NVIDIA, since they run the same code as the ATi cards, standard DX9.0 code.
The answer comes from NVIDIA itself: the 6800 series DOES perform properly with the standard DX9.0 code, so there must be something wrong with the FX series.

Partial precision is part of the DX9 spec. Valve just programmed their shaders inefficiently and did not take the time to optimize them for partial precision (or at least, not in the final version). Honestly, I'm not surprised, as with all the stuttering/crash bugs the game/engine seems sloppily programmed at best. Unfortunately, it seems other Source engine games are affected by these bugs as well, such as the new Vampire game.
 
tranCendenZ said:
Well Scali, guess what, that is the whole point of this thread. Valve should have kept in the mixed-mode path that they programmed for FX cards, because it's not much slower than dx8.1 mode and it makes DX9 playable on FX5900 cards.

You must be one of those idiots who actually bought the card then.
Valve clearly dropped the mixed mode path because it had precision problems.
Whichever way they would have chosen, they'd be criticized by FX owners. The game would either look bad or perform badly. That's not Valve's fault, that's NVIDIA's fault.

And any company right now could make a game with only SM3.0 and SM1.1 paths that would make the x800xtpe look like a $50 MX4000 budget card from Nvidia. After all, the x800xtpe does not comply with the latest DX9 standards :). Point? It's not about what is "standard," it's about what D3D features companies choose to program for. If it is in the DX9 spec, it is a DX9 standard, period.

It's not a matter of Valve not supporting the FX series. It's a matter of the FX series not being suited to the standards it tries to support. As I said above, the game would either look bad or perform badly. The X800XTPE has no such problems: all the standards it supports perform well, and as I said before, regular SM2.0 code works fine on it. Since no game is going to drop SM2.0 support anytime soon, there will never be a problem for the X800XTPE; it just supports a very common standard.
So there are 2 problems with the FX series:

1) The lower precision and relatively bad performance on SM2.0 shaders compared to other SM2.0 cards (including NVIDIA's own).
2) The most popular FX cards are the low end ones, 5200/5500/5700, which are so slow at SM2.0, that it's pretty much impossible to apply SM2.0 to a game. The fast cards get acceptable performance at low precision, but these cards are so rare (less than 3% of all gamers, apparently), that it's hard to justify the extra development cost.

Partial precision is part of the DX9 spec. Valve just programmed their shaders inefficiently and did not take the time to optimize them for partial precision (or at least, not in the final version). Honestly, I'm not surprised, as with all the stuttering/crash bugs the game/engine seems sloppily programmed at best. Unfortunately, it seems other Source engine games are affected by these bugs as well, such as the new Vampire game.

If anything, the Source engine is fast. It gets very good framerates even from relatively simple hardware, such as the 9600Pro.
It gets higher framerates than e.g. FarCry or Doom3... the only problem is those damn FX chips that are useless at SM2.0.

Oh, and since you are able to judge the efficiency of Valve's shaders... what was the last shader title you released?
 
Scali said:
If it was that simple to build OGL drivers as good as NVIDIA's, then I'm sure the others would have done so.

In case you didn't know, ATi gives free copies of Half-Life 2 with their videocards. But ATi still has to buy them from Valve obviously, so they paid Valve.

It's called low-budget. It's not about performance, it's about making it cheap. NVIDIA does the same thing, introducing the 5800 first, then the 5200 and 5600. And again, the 6800 first, then the 6600 and 6200. And they still sell many older models as well. The worst example is probably the GF4MX, which is actually based on the original GeForce 256 architecture.

Nothing wrong with that. Just means that people get slightly better products when they buy later.

NVIDIA offers plenty of products with 12 or fewer pipelines as well. Which is good for people who don't want to buy the most expensive high-end cards.

Availability is a problem, yes, but the price is no worse than the 6800 Ultra's.

That's just clever marketing, NVIDIA did the same with their FX series back in the day. Nothing wrong with that. If you're smart enough, you know what to buy anyway.

You actually have a point here. So far they seem to have only done specific optimizations for Doom3.

Yes, NVIDIA does have some nice stuff out. On the other hand, the X800XTPE (if you can find one) does perform great, and has excellent image quality. For most people it's still good value for money (most people aren't interested in SLI or linux/freebsd anyway). So ATi isn't behind in the things that matter to most people.

I agree, just buy whatever is best, don't buy something because of the brand that's on it.
I just don't think that NVIDIA is that much of a better deal for most people.. not like the 9700 was compared to the FX anyway.


Dude, do you have a problem with my analysis or something??? You don't really have to find opposing opinions on every line I type, man... I already said, vote with your wallet, not here...
 
tshen_83 said:
Dude, do you have a problem with my analysis or something??? You don't really have to find opposing opinions on every line I type, man... I already said, vote with your wallet, not here...

I'm just pointing out that most of the 'reasons' you give for not buying ATi are false, since they apply to NVIDIA just as well.
So yes, I do have a problem with your analysis. And I thought I would explain where those problems are, so people don't take your false reasons into account when they're ready to vote with their wallet.
 
Guilty of being a normal business, innocent of the bs nv fan-boys keep tossing their way.

~Adam
 
patEtik:

With DirectX 8.0, the partial precision hack will not help you anyway, because you are already running at 16-bit (partial precision), so I wouldn't waste the time.

Nightkin:

For your number 2 question, read post number 544. You'll have to wade through my blathering, as it was in the middle of another flame war, but here is the important part:

Optimus said:
On the ATI ID issue:

A previous post of mine was apparently missed due to being in the midst of the flame war. Here it is in its entirety:
Optimus said:
I went to http://forums.guru3d.com/showthread.php?threadid=115614
and followed the directions and saw no speed improvement either way with my NVidia GeForce FX 5950 Ultra (Flashed 5900). However I have localized the water fix. On my card water would never show in dx9.

Deleting the line labeled "NoUserClipPlanes" fixed this water problem nicely.

I have not tried limiting it to fp16 shader precision yet, but otherwise I see no speed benefit to straight-up dx9 in HL2 on an FX series video card. ;)

While I was working on this, it became fairly apparent how HL2 recognizes your video card: it does so by Vendor ID and Device ID. (Proof: I ran into a problem with this while attempting the fix. There are two listings for the NVidia GeForce FX 5950 Ultra; the first is Device ID 0x310 and the second is 0x333. Mine is the latter, but I started with the former. When I changed the settings in the former, nothing happened, but when I changed them in the latter, it started working.)

When you follow the instructions specified at http://forums.guru3d.com/showthread.php?threadid=115614, you are not rebadging your video card. In fact, as far as HL2 is concerned, you are deleting it from the list, forcing HL2 to fall back to the default settings at the very bottom of the file. The settings at the very bottom of the file are generic and thus do not incorporate any card-specific settings such as "NoUserClipPlanes." This tells HL2 to treat your video card as a standard, dx9-compliant, Nvidia-branded video card. It is basically assuming that you have an Nvidia card from the future... as stupid as that sounds.
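For reference, an entry in that file looks roughly like this. I'm reproducing it from memory, so treat the key names and values as approximate rather than gospel:

    // sketch from memory -- key names are approximate, check your own dxsupport.cfg
    "3"
    {
        "name"             "NVIDIA GeForce FX 5950 Ultra"
        "VendorID"         "0x10DE"   // NVIDIA's PCI vendor ID (ATI's is 0x1002)
        "MinDeviceID"      "0x0333"   // the second 5950 Ultra listing I mentioned
        "MaxDeviceID"      "0x0333"
        "DXLevel"          "81"       // the level the engine drops FX cards to
        "NoUserClipPlanes" "1"        // the line I deleted to fix the dx9 water
    }

Remove your card's entry and HL2 falls through to the generic defaults at the bottom of the file, which is all the guru3d instructions really amount to.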

So the fact of the matter is that Valve did not cripple NVidia video cards. They optimized them for dx8.1, thus crippling them in dx9.0.

It was nothing more than an oversight on their part. It is a forgivable mistake.

On your number one question, I would suggest reading patEtik's post, number 772.

---------------------------------------------

Scali said:
You must be one of those idiots who actually bought the card then.

I thought you were being very polite before when you helped me find the mixed mode path. What happened?

Scali said:
Valve clearly dropped the mixed mode path because it had precision problems.

Optimus in post #699 said:
Good News! It worked! "-dxlevel 82"

They didn't drop it completely. They just made a flawed business decision. It happens all the time. Xerox sold their mouse technology to Apple. IBM sold DOS to Microsoft. The government sold their souls to Microsoft. etc.

I personally believe that they were pressured by ATI to drop the mixed-mode path because it was producing the same quality images on an FX5900 as on the Rad9800, at higher framerates. This is not a stretch either, because both companies have done this kind of thing in the past.
 
Optimus said:
Xerox sold their mouse technology to Microsoft. IBM sold DOS to Microsoft.

Ok, wtf....

Xerox sold their GUI tech to Apple, MS licensed DOS to IBM.

Stop smoking crack kthxbye.
 
Gillette said:
Stop smoking crack kthxbye.

How old are you, n00b?

Gillette said:
Xerox sold their GUI tech to Apple

My bad. Will fix.

Gillette said:
MS licensed DOS to IBM.

Actually, unless I'm mistaken, IBM sold it to MS before MS modified it and licensed it to them, as per their contract with IBM.
 
Gillette said:
Ok, wtf....

Xerox sold their GUI tech to Apple, MS licensed DOS to IBM.

Stop smoking crack kthxbye.


Xerox licensed them the technology of the mouse. Bill Gates saw this and stole it from Apple.

Bill Gates originally said there would never ever be a need for more than 640k memory.
 
Optimus said:
patEtik:
I personally believe that they were pressured by ATI to drop the mixed-mode path because it was producing the same quality images on an FX5900 as on the Rad9800, at higher framerates. This is not a stretch either, because both companies have done this kind of thing in the past.

The IQ of the FX series is nowhere near the IQ of the 9500+ series.
The AF and AA are much better.
Maybe they dropped it because they didn't want to seem like they were bending over for a crappy video card series?
No matter how you cut the cake, the FX series was a flop.
You nvidia fanb0ys need to realize that.
I don't hate nvidia for it, I am using an nvidia card atm, but the FX series seriously is a piece of crap, to put it lightly.
I'm not sure how an FX series card is going to get equal quality when you have to use FP32 in places where FP16 isn't enough, and that's where the FX series would be really slow.
Even mixed-mode performance wasn't that great: comparable to a 9600 Pro at lower res, and slower at higher res.
http://anandtech.com/video/showdoc.aspx?i=1863&p=8
 
phaelinx said:
Xerox licensed them the technology of the mouse. Bill Gates saw this and stole it from Apple.

Bill Gates originally said there would never ever be a need for more than 640k memory.
False and false. Xerox didn't invent the mouse. Apple offered 100,000 shares for the general use of the information gained (GUI and networking) from the PARC tour, and hired people from PARC (Bruce Horn and Jef Raskin). But yes, Apple did license some ideas from Xerox. Three years before Apple released the Lisa, a company named Three Rivers Computer Corporation introduced the Perq graphical workstation. It's nice to think Apple was first, but they weren't. In fact, Three Rivers had a second-generation GUI computer (the Perq T2) even before Apple had their first Mac.

And Gates never said that, but that doesn't stop people from repeating it. Quick question: what day and where did he say it? LOL It's easy to source the truth, and much, much harder to source something that wasn't said.
 
Moloch said:
The IQ of the FX series is nowhere near the IQ of the 9500+ series.

On this point, though it is slightly off topic, I must respectfully disagree. I have an FX card and I have yet to see a card that looks better than it at the same resolution, AA, AF. Granted, other cards will have a higher framerate, but while framerate is important, I believe it was ATI themselves that stated that framerate isn't everything.

Moloch said:
The AF and AA are much better.

I never argued this point. I'd be the first to agree. I don't even use either in games because it slows my system to a slide show.

Moloch said:
Maybe they dropped it because they didn't want to seem like they were bending over for a crappy video card series?

Finally, you're on topic. I really don't think so why drop something entirely when it's almost finished. I'd understand dropping it until the first or second patch, but the only thing missing from the "-dxlevel 82" path that I've seen so far is full reflections in water. This actually suggests that they were bending over backwards for ATI rather than avoiding the notion towards nVidia.

Moloch said:
No matter how you cut the cake, the FX series was a flop.

Could not agree more. In fact, I think it is not humanly possible to agree with this statement more than I do at this moment. This may be why my other nickname is Captain Obvious.

Moloch said:
You nvidia fanb0ys need to realize that.

I'd honestly like to know when we will get some mature adults into this discussion. Do I go around calling those, like yourself, who clearly prefer ATI "ATwIts" or something equally asinine? No, I don't. This is because I am a big boy now and use words that you can look up in a dictionary to prove a point, rather than lobbing mindless insults to avoid the issue because you can't prove your argument.

You need to grow up.

Moloch said:
I don't hate nvidia for it, I am using an nvidia card atm, but the FX series seriously is a piece of crap, to put it lightly.

I agree. Yadda yadda yadda. Insert comment about wasting time and being off topic. Get mad and flame me. Move on.

Moloch said:
I'm not sure how an FX series card is going to get equal quality when you have to use FP32 in places where FP16 isn't enough, and that's where the FX series would be really slow.

Now there is a flaw in this logic. If one only uses FP32 in places where FP16 does not look the same as FP24, then the FX series will look better, simply by the definition of these terms. I grant you that it may not run as fast as an ATI card of the same technology level, but it will approach it, and according to the leaked beta of HL2, at least in my experience, it was faster. I have noticed something odd about the leaked beta benchmarks and the official beta benchmarks. Even on tests where the ATI cards seem to increase in performance, the ratio of percent change in fps of ATI vs. nVidia is not a 1:1 ratio... Interesting... (In other words, ATI gets a bigger percentage of performance boost than nVidia does, when that is theoretically impossible.)
Evidence:
http://www.digit-life.com/articles2/digest3d/1003/itogi-video-hl2_2-w2k.html
http://www.digit-life.com/articles2/digest3d/1003/itogi-video-hl2_1-w2k.html
http://anandtech.com/video/showdoc.aspx?i=1863&p=8

Moloch said:
Even mixed-mode performance wasn't that great: comparable to a 9600 Pro at lower res, and slower at higher res.
http://anandtech.com/video/showdoc.aspx?i=1863&p=8

I have this same problem with people who misquote the Bible. If you don't pay attention to context, then you might have heard one of the great Dr Martin Luther King, Jr.'s speeches and thought he was a racist, or you might hear a football coach's speech and think we're at war with some enemy country of tigers or elephants... a little Alabama/Auburn humor there for you.

If you go to the page just previous to your link there, you may notice the following statement on the very first line:

Anand Lal Shimpi said:
Valve had very strict requirements about the test systems they let us use.

Now why would Valve require a certain test system? And why only Intel? Why not AMD? What was Valve afraid of?
 
Optimus said:
Finally, you're on topic. I really don't think so why drop something entirely when it's almost finished. I'd understand dropping it until the first or second patch, but the only thing missing from the "-dxlevel 82" path that I've seen so far is full reflections in water.

Why drop something after it's been in development? Maybe they did not need it any more? The NV30 code path was in Doom3 for a number of years, but was dropped at the end of Doom3's development. Granted, there were different reasons why the NV30 code path was dropped, but it shows you that it DOES happen in the industry. Anyways, let's all just let this thread die :)
 
Optimus said:
Finally, you're on topic. I really don't think so why drop something entirely when it's almost finished. I'd understand dropping it until the first or second patch, but the only thing missing from the "-dxlevel 82" path that I've seen so far is full reflections in water. This actually suggests that they were bending over backwards for ATI rather than avoiding the notion towards nVidia.

I don't quite see how you get to that conclusion?

I have noticed something odd about the leaked beta benchmarks and the official beta benchmarks. Even on tests where the ATI cards seem to increase in performance, the ratio of percent change in fps of ATI vs. nVidia is not a 1:1 ratio... Interesting... (In other words, ATI gets a bigger percentage of performance boost than nVidia does, when that is theoretically impossible.)

Why would that be theoretically impossible? NV and ATi use different architectures... For example, if one has more shading power than the other, and you modify the code of a shader, then the effect may be larger on one architecture than on the other.

Now why would Valve require a certain test system? And why only Intel? Why not AMD? What was Valve afraid of?

Why would fear be the reason? Perhaps they just wanted to avoid some problems they encountered during development with these systems (after all, the product was not finished yet, so perhaps there were still some unsolved problems on certain systems).
Why all these conspiracy theories anyway?

I prefer to stick to facts, and one fact is that HL2 is a very shader-heavy game, and ATi cards have far more powerful shaders than most NV cards (at least SM2.0), especially the FX range. Anyone can verify that for themselves... Download the DirectX SDK, write some shaders, and benchmark them (that way you know there won't be any driver 'optimizations')... and you'll see exactly what you see with HL2.
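To give you an idea, here's roughly the kind of test I mean -- just a sketch of a simple per-pixel lighting shader, nothing HL2-specific, and the exact compiler switches may differ depending on your SDK version:

    // minimal ps_2_0 test shader -- compile with: fxc /T ps_2_0 /E ps_main test.hlsl
    sampler baseMap : register(s0);

    float4 ps_main(float2 uv       : TEXCOORD0,
                   float3 lightDir : TEXCOORD1,
                   float3 normal   : TEXCOORD2) : COLOR
    {
        float4 albedo = tex2D(baseMap, uv);                 // texture fetch
        float  ndotl  = saturate(dot(normalize(normal),     // the arithmetic part,
                                     normalize(lightDir))); // where the FX falls behind
        return albedo * ndotl;
    }

Benchmark it as-is, then change the floats to half (that's the partial precision hint) and run it again: a Radeon runs both versions at FP24 and barely moves, while an FX only gets its speed back with the half version.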
Another fact is that NV controls about half the market of Valve's target audience, so if Valve deliberately favoured ATi over NV, they'd be shooting themselves in the foot bigtime. It is in Valve's best interest to make the game run as well as possible on NV cards.
 
Scali said:
I don't quite see how you get to that conclusion?

You're right. I skipped a step or two.

Optimus (Grammatically Corrected) said:
Finally, you're on topic. I really don't think so. Why drop something entirely when it's almost finished? I'd understand dropping it until the first or second patch, but the only thing missing from the "-dxlevel 82" path that I've seen so far is full reflections in water. This actually suggests that they were bending over backwards for ATI rather than avoiding the notion towards nVidia.

Since "-dxlevel 82" is implemented for all intents and purposes with the exception of full reflection on water, it would only make sense that they would finish this part and then release a patch which enabled mixed mode at default. Finishing this is no light task, but it is neither impossible nor hard. It is simply a matter of placing the very same camera view shader on the water with the camera placed at the correct angle with respect to the players view and then running the water shader on that output. The hardware is doing all the work. The programmer only needs to optimize the code. But I do not believe that any major optimization is necessary since the two are probably already optimized individually for dx9. Reflection is hard. Unreal has been doing it for nearly a decade.

Scali said:
Why would that be theoretically impossible? NV and ATi use different architectures... For example, if one has more shading power than the other, and you modify the code of a shader, then the effect may be larger on one architecture than on the other.

On this, I concede. It could simply be that the shaders used in that particular test were better optimized for ATI hardware in their state at that time. I still find it fishy.

What I meant by "theoretically impossible" is that if an unoptimized shader that uses standard dx9 calls is implemented in either FP16 or FP24, the relative speed difference should theoretically stay the same no matter the complexity of the shaders used. In other words, a six-pipeline GPU should, theoretically, always run 1.5 (6/4) times as fast as a four-pipeline GPU.


Scali said:
Why would fear be the reason? Perhaps they just wanted to avoid some problems they encountered during development with these systems (after all, the product was not finished yet, so perhaps there were still some unsolved problems on certain systems).
Why all these conspiracy theories anyway?

Yet, again, I must concede to this point as well. I have a habit of assuming the worst when dealing with big companies. That's one of the reasons I'm disappointed in nVidia. They had the gall to say, "We think that gamers today consider [monstrously] large cooling accessories as a sort of status symbol." Yeah, when it's not required to get a standard framerate.

Scali said:
I prefer to stick to facts, and one fact is that HL2 is a very shader-heavy game, and ATi cards have far more powerful shaders than most NV cards (at least SM2.0), especially the FX range. Anyone can verify that for themselves... Download the DirectX SDK, write some shaders, and benchmark them (that way you know there won't be any driver 'optimizations')... and you'll see exactly what you see with HL2.

While I understand the fundamentals of what you are trying to say, you should be more specific. What I believe you mean is that ATI cards are much more powerful with DirectX shaders. On the other hand, nVidia cards are much more powerful with OpenGL shaders. Anyone can verify that for themselves... Download the OpenGL SDK, write some shaders, and benchmark them (that way you know there won't be any driver 'optimizations')... and you'll see exactly what you see with Doom 3, Unreal Tournament 2004, and Far Cry... Oh wait! UT2k4 and FC are DirectX games... hmmm...

Scali said:
Another fact is that NV controls about half the market of Valve's target audience, so if Valve would deliberately favour ATi over NV heavily, they'd shoot themselves in the foot bigtime. It is in Valve's best interest to make the game run as good as possible on NV cards.

Exactly, and all I'm asking is that they do this. They should go back and finish their mixed mode path so that nVidia GPUs can get the same IQ as their ATI counterparts, even if at a slower framerate.

maxius said:

Yeah, I haven't been able to get Offline mode working since HL2 came out. I tried everything their official support forum says to do, but they won't help me beyond that. I think that is illegal in its entirety. Of course, it's probably because you could take the saved data file that makes it run in offline mode and use it on any computer for single player or multiplayer without internet access.
 