Bit-tech tests Bioshock

Matas

http://www.bit-tech.net/gaming/2007/08/30/bioshock_gameplay_graphics_and_performance/8
As GameSpot already reported, the HD 2900 XT is able to overtake the GTX and Ultra in DX9 mode (however, the GTX/Ultra have slightly higher minimum fps), so we can call it a tie. In DX10 mode, the HD 2900 XT's performance drops terribly. It's not clear whether it's a Catalyst driver bug or whether the HD 2900 XT simply doesn't have enough power to handle DX10. Either way, FiringSquad's tests were telling the truth.
 
For me personally, these latest benchmarks don't mean much. When I first started playing Bioshock, I was playing at 1920x1200, and even at that resolution the jaggies still bothered me. So I downloaded the 163.44 beta drivers, got AA working correctly, and now I'm happy as a clam. Any benchmark that tests without AA is worthless to me. Come on, I spent over $600 on my GPU to not use AA? Don't think so.
 
Bioshock needs an update to handle AA correctly under DX9/DX10. No patch ---> no tests with AA.
 
GameSpot was correct, not FiringSquad; no surprise there.
Let's see if they get the drivers up to par on a four-month-old video card. Oh wait, it's a stock-clocked HD 2900 XT vs. OC'd cores on the 320MB and 640MB GTS cards, so no wonder there is an "extra" gap in the frame rate for DX10. It's times like this when the HD 2900 XT should be OC'd using CCC for a fair comparison.
* Nvidia GeForce 8800 Ultra 768MB -- operating at 612/1500/2160MHz using Forceware 163.44 beta
* Nvidia GeForce 8800 GTX 768MB -- operating at 575/1350/1800MHz using Forceware 163.44 beta
* BFGTech GeForce 8800 GTS OC 640MB -- operating at 550/1300/1600MHz using Forceware 163.44 beta
* ATI Radeon HD 2900 XT 512MB -- operating at 742/1650MHz using ATI Catalyst 8.401 BioShock hotfix.
* BFGTech GeForce 8800 GTS OC 320MB -- operating at 550/1300/1600MHz using Forceware 163.44 beta

Oh my goodness, stock clocks are still stock clocks. Regardless of what you use to OC, an OC is an OC. Plain and simple, I swear. Overdrive, RivaTuner, ATITool, nTune: it's all the same shit. "Supported" is irrelevant; the end outcome is the same.

This is why the best way to benchmark is simply to go by stock clocks; there are many methods one could use to OC, from air to LN2...
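To put some rough numbers on the stock-vs-OC complaint, here's a quick back-of-the-envelope sketch. The reference 8800 GTS core clock in it (~500MHz) is my own ballpark assumption from memory, not something from the bit-tech article, so treat it as illustration only:

[code]
# Rough sketch (Python): how big is the factory OC on the BFG GTS cards, and what
# HD 2900 XT core clock a "like-for-like" OC would roughly correspond to.
# ASSUMPTION: the reference 8800 GTS core clock (~500 MHz) is from memory, not the review.

def oc_percent(stock_mhz, oc_mhz):
    """Overclock expressed as a percentage over the stock clock."""
    return (oc_mhz / stock_mhz - 1.0) * 100.0

gts_stock_core = 500.0   # assumed reference 8800 GTS core clock (MHz)
gts_bfg_core = 550.0     # BFG 8800 GTS OC core clock listed in the review (MHz)
xt_stock_core = 742.0    # stock HD 2900 XT core clock from the review (MHz)

core_oc = oc_percent(gts_stock_core, gts_bfg_core)       # about 10%
xt_equivalent = xt_stock_core * (1.0 + core_oc / 100.0)  # about 816 MHz

print(f"BFG GTS core overclock: {core_oc:.1f}%")
print(f"HD 2900 XT core for a comparable OC: {xt_equivalent:.0f} MHz")
[/code]

Clock scaling obviously isn't linear, and memory/shader clocks matter too, so this only frames the stock-vs-OC gap; it doesn't predict frame rates.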
 
I'm sure it MUST be an ATi driver bug, it just has to be... :rolleyes:. They just released the hotfix for the game; you're not going to squeeze water from a rock. Again, no AA, so the review is only halfway done IMO (and to others). DX10 performance mirrors FS's findings, not surprisingly.
 
Better scores on Nvidia cards in an Nvidia-branded game. Predictable :)

Way too obvious.:)

[rant]

Nvidia sure has their hand up a lot of game companies' asses these days.
I remember when you could buy a game that wasn't optimized for ANY company.

"Gee, I wanna play HL3 with somewhat decent frames, but I CAN'T because it's not optimized for my card and I have to BUY A NEW $600 VIDEO CARD." All because company A bought out the coders who code the game.

Monopolizing bastards. I bet if we stopped buying cards and games for a while they'd get the hint to stop that shit.

What are we going to have to start doing, switching out video cards in our rigs just to play different games? WTF?

Game companies that favor one video card company over another can GARGLE ON MY YAMBAG.

Truly pathetic.

[/end rant]
 
Nvidia sure has their hand up a lot of game companies' asses these days.

There's an ole saying about situations like this:

If you cater to a fan you earn his gratitude, but if you cater to the masses you earn a living.

You can always tell when this applies by looking at how popular a game is. Honestly (besides the DX10 results), I believe the UT3 engine is faring well with the PC and Xbox 360 community. Is there some sort of special coding for G80 and DX10? Who knows. But does it matter, when the only difference between DX9 and DX10 is smoke and lighting effects? Not really, in the grand scheme of things.
 
Nvidia sure has their hand up a lot of game companies' asses these days.

There's a big problem with your mentality... ATI has a program like that as well :rolleyes: and somehow I don't see you completely stopping your game purchases because of it...

And there is NOTHING stopping ATI from optimizing for the game. It's a two-way street; don't blame nVidia, blame ATI for not stepping up to the plate and getting good games into their program...

And before you jump in and call me a fanboy: I haven't played it on the PC, nor will I. I played it on 360, and the lack of multiplayer made it a no-go for me on PC...

[edit] And for that matter, I don't really trust bit-tech as a review site anymore... not since they dropped their apples-and-oranges review process for a lazier, less in-depth one...
 
Game companies aren't going to build their games specifically for a single GPU or GPU company, simply because all that would net them is the alienation of a large chunk of their potential customer base.
 
So that's three sites now that show the 2900 XT being the faster overall card in Bioshock in DX9. I'm sure some NV fans still won't want to believe it, though.

ATi really needs to get a new driver or hotfix out for Vista, though. Although the market is pretty slim for people who have both Vista and a 2900 XT. Still, it's a new DX10 title, and they need more frames.
 
Nvidia sure has their hand up a lot of game companies' asses these days.
It's the other way around, actually. Involvement in TWIMTBP is zero cost, and the benefit is that developers don't have to trudge through generic or vendor-specific shader optimizations, though obviously TWIMTBP won't waste time with competitor-specific optimizations.

I remember when you could buy a game that wasn't optimized for ANY company.
AMD issues game-specific optimizations with every driver release. Why exactly do you see so many "14% improvement in X" entries in their driver release notes? Is it because of vendor-specific optimization, or not? AMD does a kickass job with post-release optimizations; they just need to get on their game with respect to developer relations.

NVIDIA usually just ends up with the head start and their logo on the box. They have the better developer relations program, so that's something of a given. You pay for TWIMTBP when you buy an NVIDIA card.

What are we going to have to start doing, switching out video cards in our rigs just to play different games? WTF?
Since when does a TWIMTBP sticker indicate that developers optimized only for NVIDIA hardware?

There are certainly titles out there that perform as well or better on AMD hardware even though they're TWIMTBP-branded games (especially so in the last generation), which doesn't lend much credence to your fears.
 
Since when does a TWIMTBP sticker indicate that developers optimized only for NVIDIA hardware?

That's not exactly what I meant. What I was asking was whether or not developers optimized only for NVIDIA hardware and not for AMD hardware. Certainly, the sticker isn't evidence to suggest that developers are completely foregoing the process of AMD-specific optimization solely because of their involvement with TWIMTBP.

A large part of shader optimization is non-vendor-specific by nature.

We asked Roy what the report contained, in general terms. “The report is quite simple – we want to make the gaming experience as good as we possibly can on all of our hardware. We give the developers a list of per-GPU optimisations that they can make to their code to help improve the experience at any given spec – in our opinion, this is vital for any developer that is serious about making a great PC game.”
Source
 
That's not exactly what I meant. What I was asking was whether or not developers optimized only for NVIDIA hardware and not for AMD hardware. Certainly, the sticker isn't evidence to suggest that developers are completely foregoing the process of AMD-specific optimization solely because of their involvement with TWIMTBP.

A large part of shader optimization is non-vendor-specific by nature.
 
There's a big problem with your mentality... ATI has a program like that as well :rolleyes: and somehow I don't see you completely stopping your game purchases because of it...

And there is NOTHING stopping ATI from optimizing for the game. It's a two-way street; don't blame nVidia, blame ATI for not stepping up to the plate and getting good games into their program...

And before you jump in and call me a fanboy: I haven't played it on the PC, nor will I. I played it on 360, and the lack of multiplayer made it a no-go for me on PC...

[edit] And for that matter, I don't really trust bit-tech as a review site anymore... not since they dropped their apples-and-oranges review process for a lazier, less in-depth one...

I'm not out to call anyone a fanboy. :p I'm not one either, as I have jumped between ATI and nVidia over the years. Lately it has been ATi cards because they were better performers, and I think their image quality is better, but I know that has changed with the 8800 series.
My sole complaint is that the last game I saw that had ANYTHING ATi on it was HL2. Obviously it ran better on ATi hardware, and the benchmarks proved it at the time.
But I'd say all of the major releases of late have 'nVidia: The Way It's Meant to Be Played' all over the opening company intros.
What that tells me is that the company that built the game favors Nvidia hardware for its titles.

Some people need to realize that instead of arguing about who is better, they should actually want the cards to perform the same or similarly, to keep prices down.
Kind of like M$, which knows there isn't much competition for its OS, so it CAN jack prices up to astronomical proportions because it knows everyone uses winblows.

Oh well, sorry for the rant :D
 
I'm not out to call anyone a fanboy. :p I'm not one either, as I have jumped between ATI and nVidia over the years.

I see TWIMTBP as more of an "it will run properly" guarantee rather than a "ZOMG IT'LL RUN LIKE CRAP ON ATI!" guarantee... Just look at Lost Planet, lmao: it ran equally junky on both, but ugly as sin on ATI :O

And to the person just a bit above this post who highlighted "our hardware" in red: you're twisting that quote to suit yourself, methinks... It's more of a guarantee that it will run well on all of their hardware, and you're trying to make it sound like it will ONLY run well on all of their hardware. Brand loyalty does bad things...

And Dunan, sorry if it sounded like I was trying to call you a fanboy; I wasn't :p
 
You can always tell when this applies by looking at how popular a game is. Honestly (besides the DX10 results), I believe the UT3 engine is faring well with the PC and Xbox 360 community. Is there some sort of special coding for G80 and DX10? Who knows. But does it matter, when the only difference between DX9 and DX10 is smoke and lighting effects? Not really, in the grand scheme of things.

I just love it... When the HD 2900 XT came out and its numbers were low, it was "Oh, it doesn't matter. It's DX9. This card will smoke everything in DX10." And now that there are some games that use DX10, and the XT doesn't exactly perform that well in them (except for CoJ), it's "DX10 doesn't matter, it's only smoke and lighting effects".

It's very funny!
 
I just love it... When the HD 2900 XT came out and its numbers were low, it was "Oh, it doesn't matter. It's DX9. This card will smoke everything in DX10." And now that there are some games that use DX10, and the XT doesn't exactly perform that well in them (except for CoJ), it's "DX10 doesn't matter, it's only smoke and lighting effects".

It's very funny!

There are like 5 DX10 games out. I don't think we've seen enough DX10 games to judge the current generation. It's obvious ATI has driver issues with Vista. Only time will tell.
 
There are like 5 DX10 games out. I don't think we've seen enough DX10 games to judge the current generation. It's obvious ATI has driver issues with Vista. Only time will tell.

I don't think you understood what I was getting at, but it's probably for the best.
 
There are like 5 DX10 games out. I don't think we've seen enough DX10 games to judge the current generation. It's obvious ATI has driver issues with Vista. Only time will tell.

Not only that, he completely ignores the fact that the gist of my post is that the difference between DX9 and DX10 in Bioshock is minuscule. I even provided a link to justify my reasoning. You see, what he did has a name, and it's called "spin doctoring". He's spinning it into something other than what I implied about Bioshock in DX9 vs. DX10, reinterpreting it as a "spin" for ATI vs. Nvidia. Sometimes someone spins someone else's post because they lack the comprehension to properly understand what's either implied or actually stated. However, regardless of whether it's a lack of comprehension or an inability to provide a rebuttal, it changes nothing regarding what I posted earlier about Bioshock in DX9 vs. DX10. The comparison screenshots from FiringSquad reflect that the differences are minuscule to none, depending on what you look at.
 
I think it is interesting that the 2900 XT may finally run well with some driver releases.
 
I can understand how Eastcoast can view his response as such from his P.O.V. If that's true, then that's fine (I'm not in the "conflict" between you two), but I do see some truth to his statement in general. Lots of people quoted, said, or implied as much. I do agree that we haven't seen enough of DX10 to draw such a definite conclusion at this time (Crysis should provide that). However, this never-ending "driver business" with ATi does make you wonder whether it's futile to keep saying "It must be the drivers!" after months and months. We all know that drivers play a moderate role, at best, in overall performance (better AA performance is out of ATi's reach on this card, IMO).

Having said that, Nvidia's drivers aren't at their best either. They have improved greatly since last winter, surely, but they can still be improved, no doubt about it. If anyone wants to bring up "optimizations" provided by the TWIMTBP and GITG programs, talk to the respective companies that run them. ATi should step up (big time) to provide better support/hardware to get more of those game, developer, and publisher deals over Nvidia, which has almost 5x the clients.
 
I can understand how Eastcoast can view his response as such from his P.O.V. If that's true, then that's fine (I'm not in the "conflict" between you two), but I do see some truth to his statement in general. Lots of people quoted, said, or implied as much.

As I stated before, the gist of my post is that the PQ of Bioshock between DX9 and DX10 is minuscule if you review FiringSquad's comparison shots. That ultimately has nothing to do with how you or anyone else interprets the opinions of others from months ago regarding R600 vs. G80. Harboring grudges based on your own interpretation of others' opinions from months ago, as a response to Bioshock's PQ, is by definition OT to the very post you are responding to. That generic response is not the answer to why the R600 has a decent frame rate in that game.
 
There is a big difference between comprehending what I posted and my POV (which can be interpreted as an opinion). As I stated before, the gist of my post is that the PQ of Bioshock between DX9 and DX10 is minuscule if you review FiringSquad's comparison shots. That ultimately has nothing to do with how you or anyone else interprets the opinions of others from months ago regarding R600 vs. G80. Harboring grudges by spinning it based on your own interpretation of others' opinions from months ago, as a response to Bioshock's PQ, is by definition OT to the very post you are responding to. That generic response is not the answer to why the R600 has a decent frame rate in that game.

Did you look at the second half of the SS differences? There is a big difference in the way the smoke/fog/shadows look.
 
Both cards perform extremely well with Bioshock. What's everyone bitching about? Just play the damn game and ENJOY IT!!! :rolleyes:
 
Did you look at the second half of the SS differences? There is a big difference in the way the smoke/fog/shadows look.
Did you actually read my previous post, or did your reading start only at the line you quoted? Take a look and you will see that I already addressed this (even though my wording is not exact).
 
There is a big difference between comprehending what I posted and my POV (which can be interpreted as an opinion). As I stated before, the gist of my post is that the PQ of Bioshock between DX9 and DX10 is minuscule if you review FiringSquad's comparison shots. That ultimately has nothing to do with how you or anyone else interprets the opinions of others from months ago regarding R600 vs. G80. Harboring grudges by spinning it based on your own interpretation of others' opinions from months ago, as a response to Bioshock's PQ, is by definition OT to the very post you are responding to. That generic response is not the answer to why the R600 has a decent frame rate in that game.

I think the frame rate the 2900 XT gets is plausible; nothing about the results in this benchmark looks weird to me. It should be pretty well known that the 2900 XT can, in many if not most cases, take the top spot when no AA is applied in a benchmark. DX10 seems to be another question at this point, so far, and that's without AA. I'm not harboring grudges, but if you think so, that's fine, lol. :rolleyes:

Of course, in CoJ the card does decently well compared to the competition with 4xAA. CoJ isn't all that great, with averages in the 20s if you want to play at max settings at any real resolution with a bit of AA. I'd place this game in the same patch-up basket as LP; neither is a good indication of what we can expect from DX10 visually and performance-wise. Bioshock doesn't appear to be using a whole lot of DX10, judging from the comparisons, yet it still looks very nice. I honestly don't think we'll get a real clue of what to expect until Crysis hits, and then we can say what card does what with more assurance in reference to DX10 (even though in DX9 it tops everything else visually). Oh, and let me know if this post is still generic; I'm just stating my opinion on what Silus said in the post that Marvelous quoted. Thanks. :p
 
Not only that, he completely ignores the fact that the gist of my post is that the difference between DX9 and DX10 in Bioshock is minuscule. I even provided a link to justify my reasoning. You see, what he did has a name, and it's called "spin doctoring". He's spinning it into something other than what I implied about Bioshock in DX9 vs. DX10, reinterpreting it as a "spin" for ATI vs. Nvidia. Sometimes someone spins someone else's post because they lack the comprehension to properly understand what's either implied or actually stated. However, regardless of whether it's a lack of comprehension or an inability to provide a rebuttal, it changes nothing regarding what I posted earlier about Bioshock in DX9 vs. DX10. The comparison screenshots from FiringSquad reflect that the differences are minuscule to none, depending on what you look at.

Now you know something about that guy. Don't get too self-involved. These corporations even pay people to do PR work for them in forums. Nvidia was caught red-handed a while ago, if you keep up with PC news. I'm sure ATI does some things too. I have no idea.
 
I can understand how Eastcoast can view his response as such from his P.O.V. If that's true, then that's fine (I'm not in the "conflict" between you two), but I do see some truth to his statement in general. Lots of people quoted, said, or implied as much. I do agree that we haven't seen enough of DX10 to draw such a definite conclusion at this time (Crysis should provide that). However, this never-ending "driver business" with ATi does make you wonder whether it's futile to keep saying "It must be the drivers!" after months and months. We all know that drivers play a moderate role, at best, in overall performance (better AA performance is out of ATi's reach on this card, IMO).

Having said that, Nvidia's drivers aren't at their best either. They have improved greatly since last winter, surely, but they can still be improved, no doubt about it. If anyone wants to bring up "optimizations" provided by the TWIMTBP and GITG programs, talk to the respective companies that run them. ATi should step up (big time) to provide better support/hardware to get more of those game, developer, and publisher deals over Nvidia, which has almost 5x the clients.

It's true the drivers from both parties aren't up to snuff. But when you see a 50% difference between WinXP DX9 performance and Vista DX9, then obviously something is wrong. The 8800 series performs very similarly across the line, while the 2900 XT doesn't. That's what I see.
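For anyone skimming, here's a trivial sketch of the arithmetic behind those percentage gaps; the FPS numbers in it are made up for illustration, not taken from any of the reviews:

[code]
# Trivial sketch (Python): percentage drop between two average frame rates.
# The FPS values below are hypothetical placeholders, not review numbers.

def percent_drop(baseline_fps, new_fps):
    """How far new_fps falls below baseline_fps, as a percentage."""
    return (1.0 - new_fps / baseline_fps) * 100.0

xp_dx9 = 80.0      # hypothetical XP DX9 average
vista_dx9 = 75.0   # hypothetical Vista DX9 average
vista_dx10 = 40.0  # hypothetical Vista DX10 average

print(f"XP DX9 -> Vista DX9: {percent_drop(xp_dx9, vista_dx9):.0f}% drop")
print(f"Vista DX9 -> DX10:   {percent_drop(vista_dx9, vista_dx10):.0f}% drop")
[/code]

A 50% drop is half the frames per second, which is why it reads as a driver or rendering-path problem rather than normal run-to-run variance.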
 
I think the frame rate the 2900 XT gets is plausible; nothing about the results in this benchmark looks weird to me. It should be pretty well known that the 2900 XT can, in many if not most cases, take the top spot when no AA is applied in a benchmark. DX10 seems to be another question at this point, so far, and that's without AA. I'm not harboring grudges, but if you think so, that's fine, lol. :rolleyes:

Of course, in CoJ the card does decently well compared to the competition with 4xAA. CoJ isn't all that great, with averages in the 20s if you want to play at max settings at any real resolution with a bit of AA. I'd place this game in the same patch-up basket as LP; neither is a good indication of what we can expect from DX10 visually and performance-wise. Bioshock doesn't appear to be using a whole lot of DX10, judging from the comparisons, yet it still looks very nice. I honestly don't think we'll get a real clue of what to expect until Crysis hits, and then we can say what card does what with more assurance in reference to DX10 (even though in DX9 it tops everything else visually). Oh, and let me know if this post is still generic; I'm just stating my opinion on what Silus said in the post that Marvelous quoted. Thanks. :p

OK, you change gears and state that Bioshock doesn't use AA. The UT3 engine uses deferred shading, which doesn't support AA (at least in DX9)... brilliant! Then you change gears and discuss CoJ, which isn't all that great (according to you). Then you change gears again for the ultimate "wait for Crysis" spin. Out of all that you said, instead of being direct you simply add some fluff. In order to respond to my post you discuss no AA, CoJ, and Crysis? OK, no problem... at least you are not showing some grudge, LOL. Just stating my opinion on your opinion of someone else's opinion that had nothing to do with his response to my post :D
 
Anyway, my 8600 GTS plays this game beautifully. Image quality for the 8x00 series really improved over the 7x00 series; it looks very similar to my previous 1900 GT. I overclocked my card to 775/1150. The 3DMark score was higher by 1100 points, but I know 3DMark can be swayed by newer technology in ways that don't show up as much in real games.

I'm playing on high at 1440x900 with 16x AF. Completely smooth. I need to do some more testing, of course, to make a complete analysis of this card since I just got it. So far so good. :)
 
If that's what you think I'm trying to do to this thread, then you're entitled to think that; no worries here, mate. I guess the first two sentences (in reference to Bio') in that post aren't positive about the 2900 XT? My second paragraph was about DX10 and comparisons in it in general, across the board, btw, so you can skip it if needed.


Then you change gears and discuss CoJ, which isn't all that great (according to you).

Hey, you could at least finish quoting that sentence of mine. If you read on, it says, "CoJ isn't all that great, with averages in the 20s if you want to play at max settings at any real resolution with a bit of AA." That doesn't mean I don't think the game is fun; I've played it, but if you max it out I don't think it's really playable FPS-wise unless you tone a few things down.

If you want to reference Bioshock and DX9/DX10 in XP and Vista, I haven't seen any benchmarks of the 2900 XT with Bioshock in XP getting drastically different FPS vs. DX9 in Vista (see the bit-tech review). With no vsync in DX9, it still scores in the upper 70s as its average. As for DX10, I don't know what to tell ya about the large performance drop. You'd think a hotfix would have stopped such a thing from happening, unless it's just a temporary fix.

You can even go back and look at FS's benchmarks (they added the 1GB model 2900s). Their results are similar, but instead of a 50% decrease, they show a 30% fall. Different sites, different reviews. [H]'s Bioshock review will probably show a similar trend. Even the old LP review (w/ 4xAA) shows a noticeable drop when switching to DX10 mode, even with older drivers; a similar outcome. If drivers really are the culprit for this trend, then we'll wait and see what they can do. If not, that's a whole lot of time spent.

EDIT #2: Oh, and why do you have this fascination with "spins", lol? You have my comparisons/examples mistaken for some kind of attempt to "change gears" and/or put a "spin" on a particular subject. That was just FYI, but I'm sure you'll keep posting that ;).
 
So at 1680x1050 on a GTS 640MB it gets an average of 58 FPS, and with a GTX it gets 71 FPS? I wonder which card I should get now :eek:
 
If that's what you think I'm trying to do to this thread, then you're entitled to think that; no worries here, mate. I guess the first two sentences (in reference to Bio') in that post aren't positive about the 2900 XT? My second paragraph was about DX10 and comparisons in it in general, across the board, btw, so you can skip it if needed.
If there are no worries, there is no need to explain. It is what it is...



Hey, you could at least finish quoting that sentence of mine. If you read on, it says, "CoJ isn't all that great, with averages in the 20s if you want to play at max settings at any real resolution with a bit of AA." That doesn't mean I don't think the game is fun; I've played it, but if you max it out I don't think it's really playable FPS-wise unless you tone a few things down.
Fine, so there is an excuse. Not everyone averages 20... Your personal experiences with another game have no relevance to this thread. Nor did I bother asking, for that matter.

If you want to reference Bioshock and DX9/DX10 in XP and Vista, I haven't seen any benchmarks of the 2900 XT with Bioshock in XP getting drastically different FPS vs. DX9 in Vista (see the bit-tech review). With no vsync in DX9, it still scores in the upper 70s as its average. As for DX10, I don't know what to tell ya about the large performance drop. You'd think a hotfix would have stopped such a thing from happening, unless it's just a temporary fix.
LOL, really? If you haven't seen it, there is no need to comment on it. There is no point made here. Also, I never implied nor suggested a hotfix. :p And if it was fixed in this manner, it's not temporary (in the context you imply), LOL.

You can even go back and look at FS's benchmarks (they added the 1GB model 2900s). Their results are similar, but instead of a 50% decrease, they show a 30% fall. Different sites, different reviews. [H]'s Bioshock review will probably show a similar trend. Even the old LP review (w/ 4xAA) shows a noticeable drop when switching to DX10 mode, even with older drivers; a similar outcome. If drivers really are the culprit for this trend, then we'll wait and see what they can do. If not, that's a whole lot of time spent.
Here is something that confirms what I've been saying all along about you. Please read up on FS's benchmarks: they were completed in a cleared level. And while you plead for [H]'s review to confirm your opinion, your reference to LP has nothing to do with any trend; it's an attempt to validate your own POV, lol.

EDIT #2: Oh, and why do you have this fascination with "spins", lol? You have my comparisons/examples mistaken for some kind of attempt to "change gears" and/or put a "spin" on a particular subject. That was just FYI, but I'm sure you'll keep posting that ;).
And what's with the words:
you
the
review
completed
cleared
level
and the use of other words I use to complete a sentence :D. Wow, fascinating, isn't it? ;)
 
Not only that, he completely ignores the fact that the gist of my post is that the difference between DX9 and DX10 in Bioshock is minuscule. I even provided a link to justify my reasoning. You see, what he did has a name, and it's called "spin doctoring". He's spinning it into something other than what I implied about Bioshock in DX9 vs. DX10, reinterpreting it as a "spin" for ATI vs. Nvidia. Sometimes someone spins someone else's post because they lack the comprehension to properly understand what's either implied or actually stated. However, regardless of whether it's a lack of comprehension or an inability to provide a rebuttal, it changes nothing regarding what I posted earlier about Bioshock in DX9 vs. DX10. The comparison screenshots from FiringSquad reflect that the differences are minuscule to none, depending on what you look at.

Oh, you know perfectly well what I was getting at and what I was calling you indirectly. Using your own words, you can "spin" whatever you want, but many of us have known your agenda since the R600 debuted. The last one I remember was those Falcon Northwest CrossFire benchmarks, which were a joke.

Anyway, in this forum we already discussed the current DX10 implementations in the games that support it. It's been shown over and over that none uses DX10's full capabilities, yet these games are what we currently have, meaning DX10 cards will have to prove themselves in them. The only thing to note is that a proper evaluation/review will obviously point out that it's still early to say these cards suck in DX10, and that it's NOT indicative of performance in DX10 games across the board, but it IS indicative for that specific game, which is the purpose of the evaluation anyway.

Also, you (and others like you in this same thread) have a tendency to choose the specific benchmarks that suit your agenda (again, see those Falcon Northwest benchmarks, among many others), so I'm really not seeing how anyone's opinion is more biased than yours when it comes to dealing with the fact that the HD 2900 XT is not a great performer. And I'm not even disputing these Bioshock benchmarks. I just prefer to wait for [H]'s real-world gameplay numbers.
 
Now you know something about that guy. Don't get too self-involved. These corporations even pay people to do PR work for them in forums. Nvidia was caught red-handed a while ago, if you keep up with PC news. I'm sure ATI does some things too. I have no idea.

How I wish I could get paid for stating facts...

So is this why you called me a fanboy through PM?
 
How I wish I could get paid for stating facts...

So is this why you called me a fanboy through PM?

Facts like the 2900 XT beats the 8800 GTX in Bioshock in WinXP, and with proper optimization it can also beat it in Vista.

I told you your secret is safe with me. :D
 