Valve sucks

An engine is the renderer, sound, networking, etc. It is not game code: water is not part of an engine, AI is not part of an engine...

Anyway, after the whole Shader Day debacle this doesn't surprise me; Valve is a company without morals. I don't mind someone saying that running FP16 isn't DirectX 9, but not allowing users to enable DX9 with FP16 rendering when their cards can handle it is just deceitful.
 
Jbirney said:
Valve did have a mixed mode for FX cards. They showed the mixed-mode results during the Shader Day event last year. They also showed the results of running it in full DX9 mode. In the end they said it was faster to run HL2 on an FX card in DX8 (8.1) mode. That's something we all knew about LAST year. Why is this news now? Are you sure that when you force something to run you're not breaking something else? I mean, you force the ATI device ID and then force 3D-Analyze to run something? Are you 100% sure that's not breaking something in the chain? Are those results valid? Are you 100% sure the IQ is the same? But no, instead of trying to make 100% sure of all this, you start a post. Nice :confused:

I do hope that someone out there is working on the comparison. I would myself if I weren't at work.

This post is informative. No one should rag on him just because he's posting uncorroborated information. If it's true, I'll be thankful for getting something more to mess around with.
 
jon67 said:
I think you need to clarify your perception of what an "engine" is. I guess your definition of "engine" excludes everything that HL2 does better than D3, i.e. textures, water, physics simulation, AI, long-distance rendering... everything but the lighting, of course, since D3 is nothing but lighting effects.
Well, there's so much we don't know about the Doom 3 engine, water and long-distance rendering being two of them. However, how in the world are the Source physics any better than Doom 3's? Being able to pick stuff up isn't part of the physics engine; it's a manipulation of the physics engine. The same goes for other stuff like the gravity gun. In fact, with Doom 3's deformable levels, their physics engine could be called better.

Textures are also engine-independent for the most part. If you don't believe me, do a little Google searching and you can find someone using the Valve textures in a Doom 3 map.

As far as outdoor maps go, I don't see anything limiting Doom 3 at all. I guess all I really have to go on is their history, which, in case you are unaware, is spectacular.

Facial animation is another thing Valve does very well, although Doom 3 is pretty damn good too.

Anyways, sorry to hijack this thread even more.
 
jon67 said:
I think you need to clarify your perception of what an "engine" is. I guess your definition of "engine" excludes everything that HL2 does better than D3, i.e. textures, water, physics simulation, AI, long-distance rendering... everything but the lighting, of course, since D3 is nothing but lighting effects.

The Source engine doesn't "do" textures better. That is a silly statement. Both games have maps that balance graphical features that use texture memory (e.g. the textures themselves, bump mapping, etc.); it is just a question of how whoever made the map and associated textures decides to mix them.

Put another way, people have made test maps using Source engine textures in Doom 3. They look the same as the Source engine maps, except for the better lighting the Doom 3 engine can do. On the other hand, they aren't doing as many of the other graphical gimmicks Doom 3 can do either (e.g. piles of bump mapping *everywhere*). Of course, HL2 maps are rendering very different KINDS of environments than Doom 3 maps, and how the engines were used reflects that.

The HL2 engine doesn't have better physics than the Doom 3 engine does. If anything, it is worse... notice all the clipping problems in HL2 that don't exist at all in Doom 3? HL2 made physics part of the gameplay in the maps by making almost everything weigh only 5 pounds and giving you a gravity gun (which was fun, BTW); Doom 3 didn't.

HL2 AI wasn't that great. It was basically the Doom 3 Z-Sec AI with hand grenades, maybe somewhat more clever but nothing to write home about. More significantly for this discussion, AI isn't in the engine; it is scripted for each critter... I don't think either game has a "real AI" component in its engine.

Long-distance rendering... I dunno. Some of the areas in Hell had fairly long view distances. Neither one had the kinds of line-of-sight seen in Far Cry. That doesn't mean that Doom 3 or the Source engine COULDN'T do it, however.

Water... Doom 3 didn't have water, because it wasn't appropriate to the setting. That doesn't mean it CAN'T do water.
 
So, coming from an FX 5900, would I be correct in assuming that this has no effect or bearing on nVidia's 6800 series?
 
Anyone try this on a 6800 yet? I'd be interested to see if it has any effect.
 
^eMpTy^ said:
Anyone try this on a 6800 yet? I'd be interested to see if it has any effect.
I'll try it at 1600x1200. I'm CPU limited at 1024x768. :p

---------------------------------
space reserved for results
---------------------------------
 
pxc said:
I'll try it at 1600x1200. I'm CPU limited at 1024x768. :p

---------------------------------
space reserved for results
---------------------------------

I'm at work all day, I'll give it a go when I get home...
 
Idea:
would it be possible to run HL2 on the D3 engine? This would clear up pretty quickly which was "better" overall, even though it would almost certainly take a little work to do.
 
starhawk said:
Idea:
would it be possible to run HL2 on the D3 engine? This would clear up pretty quickly which was "better" overall, even though it would almost certainly take a little work to do.

...that's ludicrous for about 2 dozen reasons...
 
Lord of Shadows, obs, CastleBravo.

Thanks for your replies; however, I asked for a clarification of the term "engine" as used with respect to gaming.
 
lol at the people trying to defend the FX line... face it, the line was inferior to everything else out there... stop trying to blame Valve for its garbage performance in DX9 games :rolleyes:
 
^eMpTy^ said:
I'm at work all day, I'll give it a go when I get home...
I can't get 3D-Analyze to run the Steam version of HL2. :mad: The Steam version doesn't allow running HL2.exe directly (gives errors), but instead requires:

Steam.exe -applaunch 220

to start it. The GUI of 3D-Analyze doesn't allow parameters and the batch file is acting brain-dead, too. Gonna try one more thing, but it looks like the CD/DVD version of HL2 is required to do this hack.
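
For reference, the launch step itself is trivial to script; here's a minimal sketch, assuming a default Steam install path (the path is hypothetical). It only reproduces the -applaunch command; it does NOT solve the real problem of getting 3D-Analyze's DLL hooked into hl2.exe:

```python
# Minimal sketch: launch the Steam build of HL2 the only way it allows.
import subprocess

STEAM_EXE = r"C:\Program Files\Steam\Steam.exe"  # hypothetical install path

subprocess.run([STEAM_EXE, "-applaunch", "220"])  # 220 = HL2's Steam app ID
```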
 
Why isn't this linked on the front page?

Hello, news guy?
 
jon67 said:
I think you need to clarify your perception of what an "engine" is. I guess your definition of "engine" excludes everything that HL2 does better than D3, i.e. textures, water, physics simulation, AI, long-distance rendering... everything but the lighting, of course, since D3 is nothing but lighting effects.

It's not a perception, it's what an engine is: the engine is the Windows API calls, the OpenGL/Direct3D API calls, the DirectSound/sound API calls. Maybe even some basic game-related code like collision detection and such. It's the basic part of the game that is independent of the game and common to all games.

Here's a good definition I found with a little Google work...
http://www.aspfree.com/c/a/Code-Examples/Creating-an-Engine-for-Games-for-Windows/1/

lol @ long-distance rendering, you speak of it as if there were some sort of special trick to it. To make a large map, you make a large map; nothing special has to be done.
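
To make the distinction concrete, here's a toy sketch of that layering. All the names are made up for illustration; none of this is from an actual engine:

```python
# Toy illustration of the engine/game split described above. The engine
# layer wraps platform APIs and offers shared services; AI and effects
# like water would live in game code built on top of it.

class Engine:
    """Game-agnostic layer: rendering, sound, basic collision."""

    def render(self, meshes):
        print(f"[engine] drawing {len(meshes)} meshes")  # wraps D3D/OpenGL

    def play_sound(self, name):
        print(f"[engine] playing '{name}'")  # wraps DirectSound

    def collide(self, box_a, box_b):
        # Axis-aligned overlap test: a shared service, not gameplay.
        return all(a_min <= b_max and b_min <= a_max
                   for (a_min, a_max), (b_min, b_max) in zip(box_a, box_b))


class Soldier:
    """Game code: scripted per-critter AI sitting on top of the engine."""

    def think(self, engine, my_box, player_box):
        if engine.collide(my_box, player_box):
            engine.play_sound("alert")


engine = Engine()
Soldier().think(engine, [(0.5, 2.0), (0.5, 2.0)], [(0.0, 1.0), (0.0, 1.0)])
engine.render(["soldier", "crate"])
```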
 
gordon151 said:
Less yapping, more testing. Have any of you guys actually tested this?

I visited the Guru3d and nVidia forums, and a lot of people posted that it's working for them.
 
gordon151 said:
Less yapping, more testing. Have any of you guys actually tested this?
Look at the first post. I can't test because I have the Steam version. :mad: I explained why it doesn't work a few posts above.

I really want to see how playable it is on an FX 5200. :D I'll test it after I pick up the X700 Pro 256MB from BB in a while. I don't want to open my HL2 box until I'm sure I'll use the $20-off ATI coupon.
 
So wait, if I understand this correctly...

Valve are a bunch of assholes because they coded the game in FP24, which the FX series cannot do, so the cards default to FP32, which will obviously increase the performance hit. But they run FP16 very well (of course, because it's not as good as FP32), and the difference between FP16 and FP24 is like nothing. There was no way they could've coded it so the FX cards were at FP16 by default? Or would that have ruined their "should've gone with ATI like we said 'cause we're dicks" motto?
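
For what it's worth, the "like nothing" gap between FP16 and FP24 can be eyeballed with a quick experiment. A rough sketch: FP24 is approximated here by truncating a float32 mantissa to 16 bits (ATI's s16e7 layout), ignoring the narrower exponent range, and the test computation is an arbitrary vector normalize, not an actual HL2 shader:

```python
import struct

def to_fp16(x):
    """Round-trip through IEEE half precision (s10e5)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def to_fp24(x):
    """Approximate FP24 (s16e7) by dropping the low 7 of float32's 23
    mantissa bits; the reduced exponent range is ignored, which is fine
    for values near 1.0."""
    bits = struct.unpack('<I', struct.pack('<f', x))[0]
    return struct.unpack('<f', struct.pack('<I', bits & 0xFFFFFF80))[0]

def to_fp32(x):
    """Round-trip through IEEE single precision (s23e8)."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

# A shader-ish computation: normalize a vector, quantizing every value.
v = (0.123456, 0.654321, 0.789012)
length = sum(c * c for c in v) ** 0.5

for name, q in (("fp16", to_fp16), ("fp24", to_fp24), ("fp32", to_fp32)):
    approx = [q(q(c) / q(length)) for c in v]
    err = max(abs(a - c / length) for a, c in zip(approx, v))
    print(f"{name}: max error = {err:.1e}")
```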
 
This doesn't necessarily mean there was an ulterior motive at Valve.

Consider: You're about to release a product, and you want to optimize performance on *both* types of cards. One way to do this is to determine which card is in use, and adjust the engine accordingly.
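
Sketched as code, that idea might look like the snippet below. The PCI vendor IDs are real (0x10DE is nVidia, 0x1002 is ATi), but the path names and the override hook are hypothetical, just to show where a device-ID spoof like 3D-Analyze's fits into the decision:

```python
# Hypothetical card-detection sketch: map the reported PCI vendor ID to
# a default shader path, with an override standing in for a device-ID
# spoof (3D-Analyze reporting a Radeon 9800 on an FX card).
VENDOR_NVIDIA = 0x10DE  # real PCI vendor ID for nVidia
VENDOR_ATI = 0x1002     # real PCI vendor ID for ATi

DEFAULT_PATH = {
    VENDOR_NVIDIA: "dx8.1",      # what Valve shipped for the FX series
    VENDOR_ATI: "dx9 (fp24)",
}

def pick_render_path(vendor_id, spoofed_vendor=None):
    """Choose a shader path from the (possibly spoofed) vendor ID."""
    effective = spoofed_vendor if spoofed_vendor is not None else vendor_id
    return DEFAULT_PATH.get(effective, "dx8.1")

print(pick_render_path(VENDOR_NVIDIA))              # -> dx8.1
print(pick_render_path(VENDOR_NVIDIA, VENDOR_ATI))  # -> dx9 (fp24)
```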

They may have simply made a mistake with the nVidia optimizations. While the ATi optimizations may run faster on the nVidia cards than the default nVidia optimizations, that doesn't mean:

1) that Valve intentionally crippled nVidia cards, or
2) that there aren't also problems with the ATi optimizations when used on the nVidia cards.

Also consider that the drivers for both cards have changed -- sometimes substantially -- since development and release. This could have introduced problems with the optimizations as implemented.

Either way, I'm sure a patch for nVidia users will be forthcoming. I'm actually surprised this is the first I've heard of it. I'd think that if there were so much visual corruption in Valve's flagship title on one of the most popular cards available, there would have been more of an uproar before now. (NOTE: This says more about me than it does about the problem, so don't take offense.)
 
either way..the FX series was still poo


*I'll take that award too with xenogears* ;)

And Valve doesn't suck because of this, they suck because of Steam
*great game though*

much like the FX series...being poo
 
um... btw... the link to the article goes to the Best Buy deal thread.

Any correction needed?
 
All of this fussing back and forth... be thankful you have the game and it runs. Myself, I'm stuck with a 9200SE if and when I decide to sign my life over to Valve/Steam...
 
Hey, uh, what are the shaders running at when you select low shader quality in advanced video settings? :rolleyes:
 
Badger, the Presi guy on the nVidia forums is the only person I see doing it, and I don't see where on Guru3D someone did it.

Edit: Never mind, I see the Guru3D thread now. So they were able to get it to run faster when forcing FP16 and using the DX9 path and the 9800 device ID.

pxc said:
Look at the first post. I can't test because I have the Steam version. :mad: I explained why it doesn't work a few posts above.

I really want to see how playable it is on an FX 5200. :D I'll test it after I pick up the X700 Pro 256MB from BB in a while. I don't want to open my HL2 box until I'm sure I'll use the $20-off ATI coupon.

I did, and it doesn't seem that even dderidex did it. Either way, you can do it on the Steam version; heck, I did it just a while ago to see what it would do. Just do this: for the DLL path, put "\half-life 2\bin" and select the exe as "\half-life 2\hl2.exe". Then start up Steam manually and do everything else the guy suggested. To make sure you did it correctly, the driver name for that benchmark should be "3D-Analyze v2.2 - http://www.tommti-systems.com" in the sourcebench.csv file that is generated if you ran a timedemo.
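
If you want to automate that check, here's a small sketch. Only the driver string comes from the post above; the CSV's column layout is an assumption, so it just scans every field:

```python
# Scan sourcebench.csv for the 3D-Analyze driver string to confirm the
# hook actually took. Only the string itself is quoted from the thread;
# the file's column layout is assumed unknown, so every field is checked.
import csv

EXPECTED = "3D-Analyze v2.2 - http://www.tommti-systems.com"

def hook_confirmed(path="sourcebench.csv"):
    with open(path, newline="") as f:
        return any(EXPECTED in field for row in csv.reader(f) for field in row)

print("3D-Analyze hooked:", hook_confirmed())
```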
 
This is not Valve's ISSUE... it's nVidia's issue. If a game is running at FP24, and the card can run FP16/FP32, then if FP16 will perform better, use FP16 unless FP32 is needed.

IF this game has to be run at FP24, then FP32 is the only option that card has... there's no arguing here.
 
OK, I could use some 5900XT benchmark comparisons...

Anyone?
 
This is not Valve's ISSUE... it's nVidia's issue. If a game is running at FP24, and the card can run FP16/FP32, then if FP16 will perform better, use FP16 unless FP32 is needed.

IF this game has to be run at FP24, then FP32 is the only option that card has... there's no arguing here.



You are missing the point, fool; there is no need to slow down the nVidia cards to make ATI cards look better. We all paid money for HL2 regardless of which card we have, and people deserve to have the right stuff for either card... In this case, I sentence Valve to death by lethal injection for what they've done to the gaming community... :D
 
HeavenX said:
You are missing the point, fool; there is no need to slow down the nVidia cards to make ATI cards look better.


Fool? :eek:

They didn't slow down the nVidia cards... nVidia decided to go a certain way... and they suffered. Running at FP24 is more efficient than running at FP32 when FP32 is clearly NOT needed.
 
While this certainly isn't proof of some Valve conspiracy (and could easily be an honest mistake, esp. with a large, complex code base like a game engine), it does lead us to wonder, both because of the amount of time this game has been in development and because of the amount of cash dumped into Valve's lap by ATi.

I have a feeling that nVidia's next driver release will feature great performance increases for the FX line in Half-life 2.

Question: at what precision does the X800 run the shaders?
 
HeavenX said:
You are missing the point, fool; there is no need to slow down the nVidia cards to make ATI cards look better. We all paid money for HL2 regardless of which card we have, and people deserve to have the right stuff for either card... In this case, I sentence Valve to death by lethal injection for what they've done to the gaming community... :D

Maybe you shouldn't call the guy a fool until you also know what you're talking about. They didn't slow anything down since the cards don't run in DX9 mode in the first place.
 
Half-Life 2 looks sweet on my 6800 GT card.
It looks even better than the pic that's in the first post.
 
It is Valve's issue and their fault for maiming nVidia's FX cards.

.............. alright, let the anti-nVidia / pro-Valve attacks begin.........

But, before anyone does, try to answer this:

We're really supposed to believe that one techie guy (credit to Presi at Guru3D), messing around with HL2 and his FX card on his home computer, figured out how to make the game run in DX9 with good performance... yet no one out of Valve's entire development team could do the same? :rolleyes:
 
Badger_sly said:
...no one out of Valve's entire development team could [figure out how to make the game run in DX9 with good performance]? :rolleyes:

Larger oversights have been made in the past.

As the saying goes, never ascribe to malice what can be attributed to incompetence.

Many are suspicious, though, because there was a small matter between Valve, ATi, and $6 million.
 
Badger_sly said:
It is Valve's issue and their fault for maiming nVidia's FX cards.

.............. alright, let the anti-nVidia / pro-Valve attacks begin.........

But, before anyone does, try to answer this:

We're really supposed to believe that one techie guy (credit to Presi at Guru3D), messing around with HL2 and his FX card on his home computer, figured out how to make the game run in DX9 with good performance... yet no one out of Valve's entire development team could do the same? :rolleyes:

The point that a lot are missing (especially ruined, who is going around to multiple forums posting ridiculous statements on this) is that the performance boost is relative to the performance after they have *forced* the game to run in DX9 mode, as well as using the 9800 device ID. Performance in that mode was already unbearably low, most likely because many of the shaders were running in full-precision mode, which is why forcing partial precision improved performance. The thing I'm wondering is: how does performance after doing that compare to the default NV3x mode for HL2? If it compares well and the IQ gain is worth the performance loss, then the question people should be asking is "why did they remove the mixed mode?".
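
That comparison could be scripted, roughly as below. -dxlevel and +timedemo are real Source launch options, but the Steam path, the demo name, and the assumption that Steam forwards the trailing arguments to hl2.exe are all guesses, so treat this as an outline, not a recipe:

```python
# Rough outline of the DX8.1-vs-forced-DX9 comparison: run the same
# timedemo at each dxlevel and collect the sourcebench.csv output.
# Paths and demo name are hypothetical placeholders.
import subprocess

STEAM_EXE = r"C:\Program Files\Steam\Steam.exe"  # hypothetical path
DEMO = "mybenchdemo"                             # hypothetical demo name

for dxlevel in ("81", "90"):
    subprocess.run([STEAM_EXE, "-applaunch", "220",
                    "-dxlevel", dxlevel,
                    "+timedemo", DEMO, "+quit"])
    # After each run, diff the new row in sourcebench.csv: once plain,
    # once with 3D-Analyze forcing FP16 and the 9800 device ID.
```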
 
Badger_sly said:
But, before anyone does, try to answer this:

We're really supposed to believe that one techie guy (credit to Presi at Guru3D), messing around with HL2 and his FX card on his home computer, figured out how to make the game run in DX9 with good performance... yet no one out of Valve's entire development team could do the same? :rolleyes:

If they were partially funded by ATi (as someone implied earlier in the thread...), why would they even bother? If ATi found out, they'd yank the $$$.
 
p3n said:
the FX 5*** sucked, get over it

p3n00b... if you recall, the FX 5950s were neck-and-neck with the Radeon 9800s... I personally planned on getting an Albatron FX 5950 Ultra (second best of the pack) before the 6800s came out, and I'd hate like h*** to go with anything worse than second-best.
 