Valve sucks

rcvirus said:
I'm running 2 identical systems apart from the video cards. One has an
X800 XT while the other has a 6800 Ultra. The ATi ran HL2 with nary a hiccup
or stutter, but the nvidia was brutal until the HL2 stutter fix was released.

I guess if you throw someone a few million in research fees ;) they
tend to be your friend. Smart move on ATi's part. But for the sole
nvidia holder it was enough to make you see red, if you know what I mean. :mad:

I was a die-hard nvidiot till the 9700 Pro hit the market; now I tend to favour ATi.
Graphics, in my opinion, tend to look richer on ATi. As long as something is playable,
fps doesn't concern me too much. Guess I'm an ATi-hole now ;)

Well, I have an overclocked Asus FX5700 256MB and to be honest the game ran perfectly start to finish at 1024x768 with max in-game details. Using the Counter-Strike in-game benchmark option I scored an even 60 FPS, and that's how it felt throughout the game for me.
 
I'm tired of people picturing Doom 3 as a perfect engine. I had several issues, mostly audio related.

Both the Doom 3 engine and the Source engine are excellent IMHO; blindly bashing either engine is pathetic unless you know how to make one yourself.

Has anybody played through the game using this "fix"? I've yet to see any proof of any of these claims, and still tons of nvidiots are flaming Valve like there's no tomorrow. Quite funny, actually...
 
So this explains my subpar CS:S frames: my system could run a nuclear plant, yet it can't run CS:S with good frames! :p
 
Netrat33 said:
You also won't see a single X800-series card recommended on the box, yet you do see the 6800 series. Bet you didn't notice that :D

The early benchmarks weren't Valve's or ATi's fault. That was a stolen copy.

A free voucher is pretty much the same thing as bundling the game with the card; the game just wasn't out yet. It's much like the 5XXX-series cards saying "recommended for Doom 3" before that game was out.

Dude, are you insane? The word "nvidia" or "6800" is nowhere to be found anywhere on my Doom 3 box. However, the word "ati" appears no less than 10 times on my HL2 box, which also includes a $20 coupon for any ATI card... "ati" is on the box, on the CDs, and on the voucher.

The early benchmarks were entirely Valve's fault... they were conducted by Valve, and they were publicly announced by Valve's CEO over a year before the game actually shipped... everyone knows that the game wasn't finished when the code was stolen, and the theft had very little to do with the actual delay of the game... you can ignore that fact if you want, as long as you recognize that everyone else knows it...

Bundling and the vouchers are two different things: bundling is done by the card manufacturer (BFG, Asus, etc.), whereas the ATi deal came straight from ATi themselves... that's what their $6 million got them. Not to mention the coupon in the box for an ATi card. Nvidia advertises their cards as made for Doom 3... but you don't see id trying to sell their cards for them now, do you?
 
fallguy said:
I don't see why people think Valve didn't try to get FX cards to run faster. Gabe said he spent 3 times more time on FX cards than on the ATi cards for HL2.

He also said the game would be out last September...
 
I'm really indifferent as to whether this was an oversight on Valve's part or whether it was actual malice fueled by ATI's funding... Not much I can do about it and I'm still gonna play HL2 and buy whatever video card hardware enthusiasts and reviewers recommend (not what a dev might recommend). It's very possible they simply cut corners and didn't bother at all to optimize for the FX series when shipping HL2 out.

However, as a company that's licensing their engine for other games for a substantial profit... I would imagine this is something that Valve would want to be on top of. A larger customer base that can run the games using the engine can only mean more profits; intentionally sabotaging Source performance on FX cards is like shooting themselves in the foot.

Not only will customers with FX cards dislike the fact, but companies looking to license an engine will be aware of issues such as this, and it would probably influence their decision ('sup D3/Carmack)... Bad business for Valve no matter how you look at it. That'll be a heck of a lot more relevant in the end (where the bean-counters are concerned).

You'd hope future games using the Source engine would receive a mixed-mode path to cater to as many customers as possible... You'd think. There is, after all, more than enough time to take care of that kinda thing now that HL2 is out the door.


P.S. Whatever flaws were present in the FX line are irrelevant; they're not that old, and the engine will be exposed to them thru many a game, so it's worth Valve's time (and money) to review this. If Source weren't an engine Valve was licensing then it'd matter very little imo, just one game optimized for one card. Not every gamer out there upgrades video cards as often as most of us do, nor should they have to.

Oh and I could really give a flip about the FXs personally, I had a Radeon when they were hot (still got a 9600 on a second box) and I've got a 6800 now. /shrug
 
Impulse said:
I'm really indifferent as to whether this was an oversight on Valve's part or whether it was actual malice fueled by ATI's funding... Not much I can do about it and I'm still gonna play HL2 and buy whatever video card hardware enthusiasts and reviewers recommend (not what a dev might recommend). It's very possible they simply cut corners and didn't bother at all to optimize for the FX series when shipping HL2 out.

However, as a company that's licensing their engine for other games for a substantial profit... I would imagine this is something that Valve would want to be on top of. A larger customer base that can run the games using the engine can only mean more profits; intentionally sabotaging Source performance on FX cards is like shooting themselves in the foot.

Not only will customers with FX cards dislike the fact, but companies looking to license an engine will be aware of issues such as this, and it would probably influence their decision ('sup D3/Carmack)... Bad business for Valve no matter how you look at it. That'll be a heck of a lot more relevant in the end (where the bean-counters are concerned).

You'd hope future games using the Source engine would receive a mixed-mode path to cater to as many customers as possible... You'd think. There is, after all, more than enough time to take care of that kinda thing now that HL2 is out the door.


P.S. Whatever flaws were present in the FX line are irrelevant; they're not that old, and the engine will be exposed to them thru many a game, so it's worth Valve's time (and money) to review this. If Source weren't an engine Valve was licensing then it'd matter very little imo, just one game optimized for one card. Not every gamer out there upgrades video cards as often as most of us do, nor should they have to.

Oh and I could really give a flip about the FXs personally, I had a Radeon when they were hot (still got a 9600 on a second box) and I've got a 6800 now. /shrug

Well put. If Valve really cared about customer base, they'd be all over it by now (engine licensing).
 
Elrein said:
Funny, everything I throw at my X800 XT PE runs super sweet :D. I'm sure if I had a 6800 Ultra I'd be saying the same.

Get over it and just play the f'n game :p

It's not just about bitching. If game companies do shit like this and think they can get away with it, they're wrong. If this is in fact true and the game was coded to not work right on Nvidia, they need to know that this kind of BS is not acceptable. That way, next time Nvidia decides to do the same with another game against ATi, they'll think twice. This needs to stop, whether it's Nvidia or ATi or the game dev.
 
fallguy said:
This thread is worthless. No facts, no benchmarks, no screenshots to compare.

Just another post crying about HL2 and FX cards, from someone who "doesn't have HL2 and have no intention of supporting that company".

Wake me up when some actual numbers are posted, and the crying has stopped.

Actually, I think I speak for the majority of people on this forum when I say "let him sleep"...
 
^eMpTy^ said:
Dude, are you insane? The word "nvidia" or "6800" is nowhere to be found anywhere on my Doom 3 box. However, the word "ati" appears no less than 10 times on my HL2 box, which also includes a $20 coupon for any ATI card... "ati" is on the box, on the CDs, and on the voucher.

The early benchmarks were entirely Valve's fault... they were conducted by Valve, and they were publicly announced by Valve's CEO over a year before the game actually shipped... everyone knows that the game wasn't finished when the code was stolen, and the theft had very little to do with the actual delay of the game... you can ignore that fact if you want, as long as you recognize that everyone else knows it...

Bundling and the vouchers are two different things: bundling is done by the card manufacturer (BFG, Asus, etc.), whereas the ATi deal came straight from ATi themselves... that's what their $6 million got them. Not to mention the coupon in the box for an ATi card. Nvidia advertises their cards as made for Doom 3... but you don't see id trying to sell their cards for them now, do you?


Not to correct you, but my Doom 3 box has this on it:

SUPPORTED CHIPSETS
8500
9000
9200
9500
9600
9700
9800
GF3
GF4MX
GF4
GFFX
GF6800

As for X800s: maybe they ran out of room. On my box the 6800 is the last one listed, and after that there's no more room.
 
[RIP]Zeus said:
Not to correct you, but my Doom 3 box has this on it:

SUPPORTED CHIPSETS
8500
9000
9200
9500
9600
9700
9800
GF3
GF4MX
GF4
GFFX
GF6800

Ok, you got me there...funny it doesn't mention x800s...
 
XamediX said:
Life is gonna suck for you, man. Gooo nvidia! D3 sucks on my 6800 OC but HL2 is great!


So you would rather game publishers get bought out by graphics card companies to make the game run better on their cards. :rolleyes:
 
^eMpTy^ said:
Ok, you got me there...funny it doesn't mention x800s...

I find it funny too... maybe they knew the X800 is nothing more than a hyped-up 9800, so they didn't bother.
 
fallguy said:
This thread is worthless. No facts, no benchmarks, no screenshots to compare.

Just another post crying about HL2 and FX cards, from someone who "doesn't have HL2 and have no intention of supporting that company".

Wake me up when some actual numbers are posted, and the crying has stopped.

i'm actually going to second this. this thread is now 11 pages long. here is a summary of what i have seen so far (and i wholeheartedly admit to contributing my bit to this):

- flames towards ATi
- flames towards nVIDIA
- flames towards VALVe
- flames towards [H] members who either support or loathe any of the above
- meaningless or irrelevant (spam) posts
- one single screenshot

let's see some benchmarks. some actual benchmarks comparing fps in a "fair" game (halo is typically used) to hl2 with, say, an fx5200, a radeon 9800, a 6800something, and an x800something. some comparative screenshots would be useful as well...

i further say that if these aren't put forth within about the next 10 pages (about 30min), then would a mod please lock this thread, because it is just pointless bashing? claims that VALVe sux0rs may or may not be true, but they are certainly invalid if no proof is shown.
 
^eMpTy^ said:
So you don't mind being lied to then? No wonder you like ATi so much... ;)


:rolleyes:

"nvidia: Full scene anti aliasing is something we won't be devoting resources to the development of because we prefer higher resoloutions".


bullshit.gif


Exactly, all companies play this game. With Valve, however, what was leaked was what they had. Big deal; the game's out now and has lived up to expectations.
 
starhawk said:
i'm actually going to second this. this thread is now 11 pages long. here is a summary of what i have seen so far (and i wholeheartedly admit to contributing my bit to this):

- flames towards ATi
- flames towards nVIDIA
- flames towards VALVe
- flames towards [H] members who either support or loathe any of the above
- meaningless or irrelevant (spam) posts
- one single screenshot

let's see some benchmarks. some actual benchmarks comparing fps in a "fair" game (halo is typically used) to hl2 with, say, an fx5200, a radeon 9800, a 6800something, and an x800something. some comparative screenshots would be useful as well...

i further say that if these aren't put forth within about the next 10 pages (about 30min), then would a mod please lock this thread, because it is just pointless bashing? claims that VALVe sux0rs may or may not be true, but they are certainly invalid if no proof is shown.

I was just going to post something similar. Interesting if true, to say the least. I am just wondering why no one from the staff has tried it or even commented (except Kyle laying the smackdown :) ).
 
DaveBaumann said:
Now, given that Valve had already stated that the low-end FX series would be treated as DX8, the only boards that they intended the “Mixed mode” to operate with would be the 5800/5900/5950. If you take a look at the Steam video card stats, this constitutes 2.55% of their install base – conversely, 30% of their install base is running DX9 ATI boards that would receive no performance or quality differences from this path.

Given the time for coding and validation within the game environment, especially in a title that has already slipped 14 months further than it should have, is the further coding and support requirement worth it for 2.5% of your userbase?

First of all, I apologize for responding to this without reading the rest of the thread first, but I cannot let such a glaring misunderstanding continue for even a second longer than necessary.

Second of all, the so-called "mixed mode," assuming it uses partial precision hints, would benefit the entire 5xxx line, because they all suffer from the same design choice. (Notice I did not say design "flaw." Never will I call an advancement of technology in a beneficial direction a "flaw.")

Third of all,
--------------------------------
and most importantly,
--------------------------------
the Steam Hardware Survey at http://steampowered.com/status/survey.html states, albeit not clearly, that ***18.93%*** of Steam users are using this line of video cards.

----------------------------------------------------------------------- :eek:
That means that nearly 1 out of every 5 Steam users is using an NVidia GeForceFX.
----------------------------------------------------------------------- :eek:
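For context on what the "mixed mode" path being argued about actually amounts to at the code level: on DX9-era hardware it mostly comes down to partial precision. Here is a minimal sketch, assuming the D3DX compiler from the DirectX 9 SDK; the function name, shader source, and entry point are placeholders for illustration, not anything from Valve's code.

```cpp
// A minimal sketch of forcing partial precision when compiling a DX9
// pixel shader with D3DX (DirectX 9 SDK). Names are illustrative only.
#include <cstring>
#include <d3dx9.h>

ID3DXBuffer* CompilePixelShader(const char* src, bool partialPrecision)
{
    ID3DXBuffer* bytecode = nullptr;
    ID3DXBuffer* errors   = nullptr;

    // D3DXSHADER_PARTIALPRECISION marks every computation in the
    // compiled shader as FP16-eligible. FX-class NVIDIA hardware runs
    // its half-precision path much faster; R3xx-class ATi hardware
    // computes at FP24 regardless, so the hint costs it nothing.
    DWORD flags = partialPrecision ? D3DXSHADER_PARTIALPRECISION : 0;

    HRESULT hr = D3DXCompileShader(
        src, (UINT)std::strlen(src),
        nullptr, nullptr,   // no preprocessor macros, no include handler
        "main", "ps_2_0",   // entry point and target profile
        flags,
        &bytecode, &errors, nullptr);

    if (errors) errors->Release();
    return SUCCEEDED(hr) ? bytecode : nullptr;
}
```

The finer-grained alternative is declaring individual quantities as `half` instead of `float` in the HLSL source, shader by shader, which is exactly the extra coding and validation work being weighed against the size of the FX userbase.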
 
Must I repeat myself again?



So far we have:

14,249 views

227 posts
~and the most astounding~
0 benchmark results
 
starhawk said:
i'm actually going to second this. this thread is now 11 pages long. here is a summary of what i have seen so far (and i wholeheartedly admit to contributing my bit to this):

- flames towards ATi
- flames towards nVIDIA
- flames towards VALVe
- flames towards [H] members who either support or loathe any of the above
- meaningless or irrelevant (spam) posts
- one single screenshot

let's see some benchmarks. some actual benchmarks comparing fps in a "fair" game (halo is typically used) to hl2 with, say, an fx5200, a radeon 9800, a 6800something, and an x800something. some comparative screenshots would be useful as well...

i further say that if these aren't put forth within about the next 10 pages (about 30min), then would a mod please lock this thread, because it is just pointless bashing? claims that VALVe sux0rs may or may not be true, but they are certainly invalid if no proof is shown.

I second this.
 
What I would like to know is: can this 'forcing 16-bit precision' help Vampire: Bloodlines users, as both games use the Source engine? One of my systems uses an MSI FX5900 XT, and I have been playing the game on that system as well as on one that uses an Asus V9999GE.
 
Mchart said:
So a round ball has sphere-shaped physics, instead of having a box around it.

*snicker* In Doom 3... the physics is... uh... WACK. You can punch, say, a hamburger and it FLIES across the room and lands like a tray.
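For anyone who hasn't poked at a physics engine, the quote is about collision primitives. A toy sketch (illustrative names, not Source's or Doom 3's actual API) of why the choice of hull matters:

```cpp
struct Vec3 { float x, y, z; };

// Two spheres overlap when the centre distance is less than the sum of
// their radii: the contact normal always points through the centres,
// which is what makes a ball roll and bounce believably.
bool SpheresOverlap(const Vec3& a, float ra, const Vec3& b, float rb)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    float r = ra + rb;
    return dx * dx + dy * dy + dz * dz < r * r;
}

// The cheap alternative: wrap the object in an axis-aligned box and do
// a per-axis interval test. Corners "collide" where the visible mesh
// has no surface, which is why a boxed prop lands flat like a tray.
bool BoxesOverlap(const Vec3& aMin, const Vec3& aMax,
                  const Vec3& bMin, const Vec3& bMax)
{
    return aMin.x <= bMax.x && aMax.x >= bMin.x &&
           aMin.y <= bMax.y && aMax.y >= bMin.y &&
           aMin.z <= bMax.z && aMax.z >= bMin.z;
}
```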
 
^eMpTy^ said:
Dude, are you insane? The word "nvidia" or "6800" is nowhere to be found anywhere on my Doom 3 box. However, the word "ati" appears no less than 10 times on my HL2 box, which also includes a $20 coupon for any ATI card... "ati" is on the box, on the CDs, and on the voucher.

The early benchmarks were entirely Valve's fault... they were conducted by Valve, and they were publicly announced by Valve's CEO over a year before the game actually shipped... everyone knows that the game wasn't finished when the code was stolen, and the theft had very little to do with the actual delay of the game... you can ignore that fact if you want, as long as you recognize that everyone else knows it...

Bundling and the vouchers are two different things: bundling is done by the card manufacturer (BFG, Asus, etc.), whereas the ATi deal came straight from ATi themselves... that's what their $6 million got them. Not to mention the coupon in the box for an ATi card. Nvidia advertises their cards as made for Doom 3... but you don't see id trying to sell their cards for them now, do you?

I'm sure you haven't forgotten that big nV campaign, '6800: the preferred card for Doom 3', not so long ago :p
 
Chris_B said:
:rolleyes:

"nvidia: Full scene anti aliasing is something we won't be devoting resources to the development of because we prefer higher resoloutions".


bullshit.gif


Exactly, all companies play this game. With Valve, however, what was leaked was what they had. Big deal; the game's out now and has lived up to expectations.

I can't believe you even went there...but ok Chris...ask and ye shall receive:

ATI:
"shader replacement is evil, we would never do that"
"application specfic optimizations are cheating"
"dual slot cooling solutions are horrible"
"PS3 is completely unnecessary"
"SLI is a horrible idea"

We all know the score Chris...when nvidia does something questionable it's bad...and when ati does something questionable it's "what everyone is doing"...
 
starhawk said:
i'm actually going to second this. this thread is now 11 pages long. here is a summary of what i have seen so far (and i wholeheartedly admit to contributing my bit to this):

- flames towards ATi
- flames towards nVIDIA
- flames towards VALVe
- flames towards [H] members who either support or loathe any of the above
- meaningless or irrelevant (spam) posts
- one single screenshot

let's see some benchmarks. some actual benchmarks comparing fps in a "fair" game (halo is typically used) to hl2 with, say, an fx5200, a radeon 9800, a 6800something, and an x800something. some comparative screenshots would be useful as well...

i further say that if these aren't put forth within about the next 10 pages (about 30min), then would a mod please lock this thread, because it is just pointless bashing? claims that VALVe sux0rs may or may not be true, but they are certainly invalid if no proof is shown.

Count me in...third...

My only questions to all the bashers of both companies are "why??" and "what's the point??" Do you honestly think that flaming each other about your video card choices in here is actually going to make a difference? This thread would be a really good read if it held to the author's original topic without the constant interruption of QWERTY punches.
 
Mchart said:
But too bad I already have more fun playing Contra on my NES than I did playing HL2.

That's not fair

Contra kicks many a game's ass still today :)
 
-=bladerunner=- said:
I'm sure you haven't forgotten that big nV campaign, '6800: the preferred card for Doom 3', not so long ago :p

Hellllllllllooooooooo???

Nvidia marketing nvidia cards makes sense...VALVE marketing ati cards does not make sense...

Valve should be impartial...putting "ATi" all over the box and even the cds and having an ati coupon in the box is just over the line...
 
D-Rock said:
Count me in...third...

My only questions to all the bashers of both companies are "why??" and "what's the point??" Do you honestly think that flaming each other about your video card choices in here is actually going to make a difference? This thread would be a really good read if it held to the author's original topic without the constant interruption of QWERTY punches.

D-Rock got 100 kudo points! See the similar post for bsoft above for details on how they can be used...
 
D-Rock said:
Count me in...third...

My only questions to all the bashers of both companies are "why??" and "what's the point??" Do you honestly think that flaming each other about your video card choices in here is actually going to make a difference? This thread would be a really good read if it held to the author's original topic without the constant interruption of QWERTY punches.

Right, because the "why can't we all just get along" posts are so much better...
 
I used 3D-Analyze on my 6600GT to test low precision. The benchmark scores came out easily within the margin of error (0.1 fps), so I'm not even sure it's working. I'll test it later on my FX 5200, which should show a big difference between FP32 and FP16.

3D-Analyze is still horribly buggy, so good luck to anyone else who's running it. And when it crashes, it can leave bad versions of DLLs and EXEs in your HL2 directory (or possibly overwrite the backups with patched/replaced versions). I recommend backing up your Steam directory before running that mess.
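Since the thread keeps demanding numbers: a 0.1 fps delta only means something relative to run-to-run noise. A hedged sketch of the sanity check, with made-up fps values for illustration (not pxc's actual results):

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Mean and sample standard deviation over repeated timedemo runs.
static void Stats(const std::vector<double>& fps, double& mean, double& sd)
{
    mean = 0.0;
    for (double f : fps) mean += f;
    mean /= fps.size();
    double var = 0.0;
    for (double f : fps) var += (f - mean) * (f - mean);
    sd = std::sqrt(var / (fps.size() - 1));
}

int main()
{
    // Illustrative numbers only: five runs each at FP32 and forced FP16.
    std::vector<double> fp32 = {61.2, 60.9, 61.4, 61.0, 61.3};
    std::vector<double> fp16 = {61.3, 61.1, 61.2, 61.5, 61.0};

    double m32, s32, m16, s16;
    Stats(fp32, m32, s32);
    Stats(fp16, m16, s16);

    // Crude rule of thumb: if the means differ by less than the combined
    // run-to-run spread, the "fix" isn't measurably doing anything.
    double diff  = std::fabs(m16 - m32);
    double noise = s32 + s16;
    std::printf("FP32 %.2f +/- %.2f fps, FP16 %.2f +/- %.2f fps\n",
                m32, s32, m16, s16);
    std::printf(diff > noise ? "difference is real\n"
                             : "within margin of error\n");
    return 0;
}
```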
 
^eMpTy^ said:
Hellllllllllooooooooo???

Nvidia marketing nvidia cards makes sense...VALVE marketing ati cards does not make sense...

Valve should be impartial...putting "ATi" all over the box and even the cds and having an ati coupon in the box is just over the line...


Valve auctioned off the rights to exactly this. ATi outbid nVidia. End of story. Personally, I think it's an intriguing marketing move. Is it necessarily in everyone's best interest? No, not in the long run. Is there something wrong with it? Not really. It is exactly this type of behavior that falls into the business categories of "sponsorship", "partnership" and "co-branding". It's no more and no less than the TWIMTBP campaign by nVidia. Yet for years now ATi proponents have had to sit through (or ESC out of) what is, in essence, an nVidia commercial at the beginning of some of the most popular games in recent years.
 
^eMpTy^ said:
Dude, are you insane? The word "nvidia" or "6800" is nowhere to be found anywhere on my Doom 3 box. However, the word "ati" appears no less than 10 times on my HL2 box, which also includes a $20 coupon for any ATI card... "ati" is on the box, on the CDs, and on the voucher.

The early benchmarks were entirely Valve's fault... they were conducted by Valve, and they were publicly announced by Valve's CEO over a year before the game actually shipped... everyone knows that the game wasn't finished when the code was stolen, and the theft had very little to do with the actual delay of the game... you can ignore that fact if you want, as long as you recognize that everyone else knows it...

Bundling and the vouchers are two different things: bundling is done by the card manufacturer (BFG, Asus, etc.), whereas the ATi deal came straight from ATi themselves... that's what their $6 million got them. Not to mention the coupon in the box for an ATi card. Nvidia advertises their cards as made for Doom 3... but you don't see id trying to sell their cards for them now, do you?

Well... once again, I'm not saying it's on Doom 3's box... a rare thing indeed. ;) ADMIT IT!!

Uh... my understanding is that the theft is what hurt Valve releasing it early.

I don't see any difference between bundling and vouchers. Yes, one is in the box, I know. But ATi sponsored the game (just like nvidia does) and they want to push it for their cards (whether true or not, much like TWIMTBP). And YES, if nvidia is advertising their cards as the recommended card for Doom 3, then YES, they are selling their cards for THAT popular game. It's all advertising either way.
 
pxc said:
I used 3D-Analyze on my 6600GT to test low precision. The benchmark scores came out easily within the margin of error (0.1 fps), so I'm not even sure it's working. I'll test it later on my FX 5200, which should show a big difference between FP32 and FP16.

It probably would come out within the margin of error on the 6600GT, since the GeForce 6 series was basically designed around FP32, whereas the FX series was designed to use FP16 with FP32 as an 'extra feature'.

Still, the million-dollar question is... what did it LOOK like? On a GeForce 6-series card, performance may not differ going from 32-bit precision down to 16-bit simply because the card is designed to do both equally well... but 16-bit precision is 16-bit precision on the FX *OR* the 6 series. Either it will look very different, or it won't.
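On the LOOK question, the mechanics are easy to demonstrate: IEEE half precision (s10e5) keeps a 10-bit mantissa, so every stored value is quantized to roughly one part in 2048, and the error compounds over a long shader (banding in specular highlights was the classic FX-era symptom). A self-contained round-trip sketch, handling only ordinary normal values (denormals, infinities and NaN are ignored for brevity):

```cpp
#include <cstdint>
#include <cstdio>
#include <cstring>

// Round-trip a 32-bit float through IEEE half precision (s10e5).
// Normal values only; denormals, infinities and NaN are ignored.
static float ViaFp16(float f)
{
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);

    uint32_t sign     = (bits >> 16) & 0x8000u;         // sign bit
    uint32_t exponent = ((bits >> 23) & 0xFFu) - 112u;  // rebias 127 -> 15
    uint32_t mantissa = (bits >> 13) & 0x3FFu;          // keep top 10 bits

    uint16_t half = (uint16_t)(sign | (exponent << 10) | mantissa);

    // Expand back to float32: rebias the exponent and pad the mantissa.
    uint32_t out = ((uint32_t)(half & 0x8000u) << 16)
                 | ((((half >> 10) & 0x1Fu) + 112u) << 23)
                 | ((uint32_t)(half & 0x3FFu) << 13);

    float g;
    std::memcpy(&g, &out, sizeof g);
    return g;
}

int main()
{
    // A value typical of lighting math: everything below the 10th
    // mantissa bit is thrown away by the round trip.
    float x = 0.7137f;
    float y = ViaFp16(x);
    std::printf("fp32 %.7f -> fp16 -> fp32 %.7f (error %.7g)\n",
                x, y, (double)(x - y));
    return 0;
}
```

One such rounding step is invisible; dozens of them chained through a ps_2_0 shader are where the visible differences, if any, would come from.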
 
^eMpTy^ said:
I can't believe you even went there...but ok Chris...ask and ye shall receive:

ATI:
"shader replacement is evil, we would never do that"
"application specfic optimizations are cheating"
"dual slot cooling solutions are horrible"
"PS3 is completely unnecessary"
"SLI is a horrible idea"

We all know the score Chris...when nvidia does something questionable it's bad...and when ati does something questionable it's "what everyone is doing"...


Can't recall ATi putting shader replacement into the drivers; there's a HACK (I'll save you the time of capitalising it like a 2-year-old), but it's not officially from ATi.

They can all be disabled in ATi's drivers; as far as I'm aware, only specific optimisations can be disabled in Forceware.

If they sound like a hairdryer, they're horrible; if not, I don't have a problem.

How many games are using PS 3.0? This "must have omgomg get it now" feature sure has had a lot of uses thus far.

Jury's still out on SLI.

:)
 
pxc said:
I used 3D-Analyze on my 6600GT to test low precision. The benchmark scores came out easily within the margin of error (0.1 fps), so I'm not even sure it's working. I'll test it later on my FX 5200, which should show a big difference between FP32 and FP16.

3D-Analyze is still horribly buggy, so good luck to anyone else who's running it. And when it crashes, it can leave bad versions of DLLs and EXEs in your HL2 directory (or possibly overwrite the backups with patched/replaced versions). I recommend backing up your Steam directory before running that mess.

Now you tell me =/.
 