GTX Owners say Ouch!!

And what are you trying to prove here with those quotes?

Anyways, yes, I do avoid some games, usually the bad ones.. I did play a few from that list, such as Oblivion, G.R.A.W. (which I didn't like), and Company of Heroes. I might try STALKER out too when I get the chance.

I'm gonna try to find a good deal this November if the GTX/Ultras get any lower, or I might just get the new GT/GTS.

I'm sure you came nowhere near "Max" settings in Oblivion, even at low resolutions, not to mention trying to use HDR+AA like other cards can.

7900gs OC: http://www.hardocp.com/article.html?art=MTMwNywzLCxoZW50aHVzaWFzdA==

8800gts OC: http://www.hardocp.com/article.html?art=MTMzOCwzLCxoZW50aHVzaWFzdA==

Regarding AF quality:
http://www.hardocp.com/article.html?art=MTIxOCw2LCxoZW50aHVzaWFzdA==

You see the top, I see the bottom. There is a huge difference in quality for all games.

As for some other games, you can see that your card can barely handle them @ 1280x1024 let alone 1680x1050.

http://www.anandtech.com/printarticle.aspx?i=2975

But enough of these facts, I'm sure you will ignore them anyway.
 
I played Oblivion at 1280x1024 with max settings before I upgraded my monitor, and it ran really smoothly. Every game on the Source engine runs perfectly fine with max settings as well; in fact, I need to enable V-sync with Source. There are many games I played that came out this year and last. Whether it was for one hour or until I beat it, they all ran smoothly at both 1280x1024 and 1680x1050. I'm not gonna give you a whole list for your sake, because you linking benchmarks would prove nothing to me, but I played games like FEAR, Warhammer 40k, WoW, Guild Wars, CoH, Gothic 3, and I was in the beta of LoTR:Online. Also, my GS is OC'd, which makes me wonder what clock speeds these cards are running at.

HDR+AA isn't something I'd go out of my way to get; in some games it's even useless. And as for the "superior" AF quality, it's just a minor improvement. Some people can't even tell the difference between 4xAA and 8xAA.
 
I played Oblivion at 1280x1024 with max settings before I upgraded my monitor, and it ran really smoothly. Every game on the Source engine runs perfectly fine with max settings as well; in fact, I need to enable V-sync with Source. There are many games I played that came out this year and last. Whether it was for one hour or until I beat it, they all ran smoothly at both 1280x1024 and 1680x1050. I'm not gonna give you a whole list for your sake, because you linking benchmarks would prove nothing to me, but I played games like FEAR, Warhammer 40k, WoW, Guild Wars, CoH, Gothic 3, and I was in the beta of LoTR:Online.

Did you even read the review?!?!

Your card can't handle medium at 1280x1024 in Oblivion, and you're claiming you ran max at 1280x1024?!

BTW, the Source engine is highly optimized; you can run most Source games on absolute max with a Pentium 4 and a 6600, so that's not saying much.

Look man, you just keep lying and digging yourself in deeper. Your card cannot handle FEAR, CoH, or Gothic 3 at max settings, even without AA and AF, at your resolution. Why you keep claiming your card can do these things, I don't know. Just face the facts: you claimed there was nothing that took advantage of the 8 series when it was released, and the fact is there were plenty of games that did just that.

Fact is, you claim your card has run all of these games over the last year on 100% max, and you're full of shit. That's all there is to it. There are over a dozen games mentioned in this thread that were either out when the 8800s were released or came out shortly after, and your 7900GS simply cannot run them on max. Why you decide to lie and make obvious bullshit claims of running these games at 100% just makes me wonder who you're trying to fool...

A 7900GS running Oblivion at max :D:D I needed a laugh.
 
No one's gonna tell me what I saw and played never existed. I'll admit Oblivion wasn't played on my current monitor, but it handled fine at 1280x1024 regardless. FEAR handled really well on my system. I haven't played some games like STALKER, which are really graphically intense, but I played Bioshock with 30+ FPS at almost max settings, and this is a new game, might I add. People who are trying to convince me about things I did or didn't see by showing these random benchmarks are either mad I didn't need an 8800 for the past year or are jealous for whatever reason, but it doesn't bother me or prove anything to me. All I was saying since the beginning was that I'd be wasting $750 by getting an Ultra when I didn't need it, instead of waiting until I actually did need it and the prices went down. It's a win/win situation. Hopefully the 8 series will drop dramatically within the next few months and I can purchase one without regrets.
 
but I played Bioshock with 30+ FPS at almost max settings and this is a new game might I add.

This is my point: you claim you run all these games at 100% max, then when you get called out on your bullshit, your story changes to "almost max".

I'm not mad, I just have a thing against people who blatantly lie and make their lies as freaking obvious as you do. You're running around claiming your 7900GS runs all of these games at max when it's been proven by all of these benchmarks that that is not possible. We have a plethora of proof against your claims, and all you have is words.

With that said, I'm not listening to your bullshit anymore. You lied about a few things and were wrong about a few as well, and you just ignore these facts and keep going with your nonsense like you weren't proven wrong nearly every time you posted something. And it's not just me, either; you have been proven wrong by multiple members here, and you still dig in your corner by yourself and stick to your bullshit.

I have seen enough. You are never going to recognize that the facts are against your lies, and you will beat this lie into the dirt until the whole lot of us get banned over this argument. I am going to bow out and simply put you on ignore; I advise you do the same to me, and then we will never have the displeasure of running into each other again.
 
No one's gonna tell me what I saw and played never existed. I'll admit Oblivion wasn't played on my current monitor, but it handled fine at 1280x1024 regardless. FEAR handled really well on my system. I haven't played some games like STALKER, which are really graphically intense, but I played Bioshock with 30+ FPS at almost max settings, and this is a new game, might I add. People who are trying to convince me about things I did or didn't see by showing these random benchmarks are either mad I didn't need an 8800 for the past year or are jealous for whatever reason, but it doesn't bother me or prove anything to me. All I was saying since the beginning was that I'd be wasting $750 by getting an Ultra when I didn't need it, instead of waiting until I actually did need it and the prices went down. It's a win/win situation. Hopefully the 8 series will drop dramatically within the next few months and I can purchase one without regrets.

You are either A. stupid, B. delusional, or C. a troll.

I'm going to tell you what you saw, and it was not playable framerates in Oblivion with all details maxed at 1280x1024.

I have a 7900 GTX, and it barely handles FEAR with everything up except 2x on AA. You also did not play Bioshock with 30+ FPS at almost max settings.

The Source engine is also a pointless comparison, as it's almost 3 years old now and highly optimized. My 6800 Go in my Pentium M laptop screams through Source.
 
I never lied about anything. In fact, I even told everyone I can't handle newer games like Bioshock well. I'm not gonna put you on ignore, and no one should be banned. People shouldn't care so much to the point of putting up benchmarks as proof, because in my eyes they are far from it. You can believe whatever you'd like to believe, but I personally choose to believe what I see.

Wow, looks like nissanztt90 wants to step up from the stupid one-word posts. If you read correctly, I did do all that; you can't change it. It looks to me like you have a severe case of bottlenecking on your hands, or you just need to OC your comp a bit.
 
It was just made aware to me on another forum that these cards will only work on 1.1 or 2.0 PCI-e boards, that is, the P35 or X38 chipsets. I presumed backward compatible meant 1.0a as well, but it does not, so all these people getting excited need to slow down a bit.
 
I never lied about anything. In fact, I even told everyone I can't handle newer games like Bioshock well. I'm not gonna put you on ignore, and no one should be banned. People shouldn't care so much to the point of putting up benchmarks as proof, because in my eyes they are far from it. You can believe whatever you'd like to believe, but I personally choose to believe what I see.

Wow, looks like nissanztt90 wants to step up from the stupid one-word posts. If you read correctly, I did do all that; you can't change it. It looks to me like you have a severe case of bottlenecking on your hands, or you just need to OC your comp a bit.

Way to go, champ. My 3.0GHz Core 2, I'm sure, is really holding me back in GPU-intensive games.

I'm not changing anything, just simply telling you that you're wrong. Plain and simple. Your computer adheres to physics, and there's no way a 7900 GS is playing Oblivion with max details at that resolution with playable framerates, and in this case "playable" can be as low as 15 or 20.

It was just made aware to me on another forum that these cards will only work on 1.1 or 2.0 PCI-e boards, that is, the P35 or X38 chipsets. I presumed backward compatible meant 1.0a as well, but it does not, so all these people getting excited need to slow down a bit.

I highly doubt that. Considering how new P35 and X38 are, and how very few people have them, it would absolutely kill 8800GT sales. And by kill, I mean mercilessly slaughter, and by mercilessly slaughter, I mean Megatron hunting a school bus full of preschoolers.
 
I never lied about anything. In fact, I even told everyone I can't handle newer games like Bioshock well. I'm not gonna put you on ignore, and no one should be banned. People shouldn't care so much to the point of putting up benchmarks as proof, because in my eyes they are far from it. You can believe whatever you'd like to believe, but I personally choose to believe what I see.

Wow, looks like nissanztt90 wants to step up from the stupid one-word posts. If you read correctly, I did do all that; you can't change it. It looks to me like you have a severe case of bottlenecking on your hands, or you just need to OC your comp a bit.


I guess this all boils down to Manbearpig feeling comfortable running those games with a 7900GS, and that is fine. When I was running Oblivion with an X2 3800+ and 7800GT SLI, I ran with everything maxed (maybe except AA or AF) at 1280x1024. Everything ran smooth as well; I didn't check the frame rate, but I did notice the frame rate dropped (slowed down) in outdoor scenes. Maybe my average frame rates were 30+, so it might have seemed fine at the time because I had never experienced anything faster. So Manbearpig is probably experiencing the same thing: he hasn't experienced anything better, so perception-wise he feels comfortable with what he has.

As for CoH, I was running the X2 3800+ and 7800GT at 1280x1024 with everything maxed as well. MMORPGs tend to have lower hardware requirements, for the obvious reason that more people can then play them.

I for one think Manbearpig should run a couple of games that he thinks he runs well with everything maxed at 1680x1050 and give us the average frame rates, so we can put this argument aside.

Playable in my book is ~30FPS+

Yes, we all have different standards for what is acceptable. 30 FPS for an adventure game or MMORPG is fine because they have auto-targeting or target lock, but in a first-person shooter like BF2142 you will be target practice for the others.
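To put those numbers in perspective, the playability argument is really about frame time. A minimal sketch (plain Python, just arithmetic) of converting frame rate into time-per-frame:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given average frame rate."""
    return 1000.0 / fps

# At 30 FPS each frame takes ~33 ms; at 60 FPS only ~17 ms. In a twitch
# shooter, that extra delay per frame (on top of input lag) is why the
# 30 FPS player ends up as target practice against 60 FPS opponents.
print(round(frame_time_ms(30), 1))  # 33.3
print(round(frame_time_ms(60), 1))  # 16.7
```

The same math is why a dip from 30 FPS hurts more than a dip from 60: frame time grows hyperbolically as FPS falls.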
 
Sorry, but it clearly states it is PCIe 2.0, which is not going to work on 1.0.
 
Sorry, but it clearly states it is PCIe 2.0, which is not going to work on 1.0.

2.0 is backwards compatible with both the bus and the cards.

Loads of info here

PCI-SIG said:
A3: (snip) Backward compatibility is retained, as existing 2.5 GT/s adapters can plug into 5.0 GT/s slots and will run at the slower rate. Conversely, new PCIe 2.0 adapters running at 5.0 GT/s can plug into existing PCIe slots and run at the slower rate of 2.5 GT/s.

More on this,

PCI-SIG said:
Q5: Then PCIe 2.0 must be backward compatible with PCIe 1.1 and 1.0?
A5: Yes. The PCIe Base 2.0 specification supports both the 2.5GT/s and 5GT/s signaling technologies. A device designed to the PCIe Base 2.0 specification may support 2.5GT/s, 5GT/s or both. However, a device designed to operate specifically at 5GT/s must also support 2.5GT/s signaling. The PCIe Base specification covers chip-to-chip topologies on the system board. For I/O extensibility across PCIe connectors, the Card Electromechanical (CEM) and ExpressModule™ specifications will also need to be updated, but this work will not impact mechanical compatibility of the slots, cards or modules. Currently, the PCI-SIG is defining the PCIe CEM 2.0 specification which has been released to members for review at v0.5. There are currently no plans to adapt the PCIe Mini CEM specification for the faster bit rate as the market need has not yet materialized.

PCI-E will be backwards compatible both ways: the new bus will support current PCI-E cards, and the current bus will support PCI-E 2.0 cards.

I'm also willing to bet that, since the bandwidth of current PCI-E has not even come close to being topped, running a 2.0 card in a current slot will yield virtually no performance difference for now.
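For context on that bet: both PCIe 1.x and 2.0 use 8b/10b encoding (10 bits on the wire per data byte), so usable one-way bandwidth works out roughly as below. This is a back-of-the-envelope sketch, not an exact spec figure; it ignores protocol overhead beyond the line encoding:

```python
def pcie_bandwidth_gb_s(gt_per_s: float, lanes: int) -> float:
    """Approximate usable one-way bandwidth in GB/s for a PCIe link.

    gt_per_s: per-lane signaling rate in gigatransfers per second.
    Divides by 10 because 8b/10b encoding puts 10 bits on the wire
    for every data byte.
    """
    return gt_per_s * lanes / 10.0

print(pcie_bandwidth_gb_s(2.5, 16))  # PCIe 1.x x16 -> 4.0 GB/s
print(pcie_bandwidth_gb_s(5.0, 16))  # PCIe 2.0 x16 -> 8.0 GB/s
```

So a 2.0 card dropped into a 1.x x16 slot still gets about 4 GB/s each way, which graphics cards of this era rarely saturate.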
 
First he says he gets 30+ fps in Bioshock at near-max settings, then says his card can't handle newer games like Bioshock??? Coherent story ftl. Was that also in DX10 mode on your 7900? :p
 
First he says he gets 30+ fps in Bioshock at near-max settings, then says his card can't handle newer games like Bioshock??? Coherent story ftl. Was that also in DX10 mode on your 7900? :p

I think he's a troll... but it's fun to own them nonetheless.
 
Nevertheless, if he is happy with 30 fps in gaming, that is fine with me. Aren't console games all 30 fps anyway?

Maybe he is, maybe he isn't; my post was in reference to this:

First he says he gets 30+ fps in Bioshock at near-max settings, then says his card can't handle newer games like Bioshock??? Coherent story ftl. Was that also in DX10 mode on your 7900? :p
 
First he says he gets 30+ fps in Bioshock at near-max settings, then says his card can't handle newer games like Bioshock??? Coherent story ftl. Was that also in DX10 mode on your 7900? :p

Bioshock is a new game that can be played in DX10 mode, and there are many other similar games like it. Did you even read anything about me using DX10... no??? Reading comprehension ftl. 30fps is playable in a singleplayer game in some genres; like others stated, I'm a sitting duck if I play an FPS online at 30fps. I can barely newer play games online such as UT3, which I'm getting 25-30 avg fps in.

He's a troll.. he's not a troll.. that word's being thrown around like a football these days.
 
Sorry, just curious, in your sentence "Reading comprehension ftl.", what is 'ftl'?

1. For the Love of God

2. Faster than Light

3. F**k t..... l.......

4. Something else
 
Bioshock is a new game that can be played in DX10 mode, and there are many other similar games like it. Did you even read anything about me using DX10... no??? Reading comprehension ftl. 30fps is playable in a singleplayer game in some genres; like others stated, I'm a sitting duck if I play an FPS online at 30fps. I can barely newer play games online such as UT3, which I'm getting 25-30 avg fps in.

He's a troll.. he's not a troll.. that word's being thrown around like a football these days.

You said there has been no reason for the 8800GTS/GTX cards because your 7900GS can handle all the games except for the "newer" ones. I've constantly proven you wrong by showing that your card cannot handle a lot of the "newer" games, meaning anything released over a year ago, before the 8 series even came out :rolleyes:

If you want me to believe you can play Oblivion on "max" settings @ 1280x1024 with your 7900GS, please put up (some numbers) or shut up. The [H] link I posted was also run on a C2D and an OCed 7900GS.
 
I suggest everyone start ignoring Manbearpig right now. I think he gets off like this, arguing with people on internetz. I've seen his kind before.
 
You said there has been no reason for the 8800GTS/GTX cards because your 7900GS can handle all the games except for the "newer" ones. I've constantly proven you wrong by showing that your card cannot handle a lot of the "newer" games, meaning anything released over a year ago, before the 8 series even came out :rolleyes:

If you want me to believe you can play Oblivion on "max" settings @ 1280x1024 with your 7900GS, please put up (some numbers) or shut up. The [H] link I posted was also run on a C2D and an OCed 7900GS.

I just went through your link: http://www.anandtech.com/printarticle.aspx?i=2975

Actually, his 7900GS is pretty fast, about the speed of an 8600GT/GTS. Of course, the review uses a much faster C2D X6800, but it's playable, especially in F.E.A.R. So at 1280x1024 with nearly everything maxed, his 7900GS should get frame rates no higher than the following:

[AnandTech benchmark graphs: 14522.png–14526.png]
 
I suggest everyone start ignoring Manbearpig right now. I think he gets off like this, arguing with people on internetz. I've seen his kind before.

I'm arguing with them because they're arguing with me. These people get off on trying to prove others wrong. They'd do anything they can to argue, such as posting essays, researching, and linking sites; it's in their nature, they were born with these skills from the start. Many of them haven't mastered the art of "arguing with people on internetz" like I have, but they're still young. As they reach adulthood, they begin to think for themselves. At this stage many struggle to survive the harsh competition. They stop relying on their mothers and begin feeding off the "world wide web" as a main resource. In the end, only the strongest survive, leaving the surviving weak to fend for themselves, like the cowards they are. Even then, there's still a long way to go before they reach their prime state.

I know this 'cause I was one of them.. till I rebelled.. Once a master at their own game, I am now alone in a world of hate... fending for myself like the weak.
 
I just went through your link: http://www.anandtech.com/printarticle.aspx?i=2975

Actually, his 7900GS is pretty fast, about the speed of an 8600GT/GTS. Of course, the review uses a much faster C2D X6800, but it's playable, especially in F.E.A.R. So at 1280x1024 with nearly everything maxed, his 7900GS should get frame rates no higher than the following:

[IMG*]http://images.anandtech.com/graphs/8600%20followup_042407120453/14525.png[/IMG]

[IMG*]http://images.anandtech.com/graphs/8600%20followup_042407120453/14523.png[/IMG]

[IMG*]http://images.anandtech.com/graphs/8600%20followup_042407120453/14526.png[/IMG]

[IMG*]http://images.anandtech.com/graphs/8600%20followup_042407120453/14522.png[/IMG]

About the speed of an 8600GT, wow :rolleyes:

FEAR is the only game that looks playable, and that's w/o any AA.

Also, don't hotlink images, it's a big no-no.
 
Can the weak fend for themselves?


I dunno about you, but my 7900GS plays Oblivion fine at 1440x900 with 4xAA - I get great framerates. :D
 
Oh, and he was also claiming he was using his res of 1680x1050, not 1280x1024 ;)

Yeah, I get great framerates with my 7900GS at that resolution too. Even at 1920x1200 I get great framerates. They are so great, it's great.
 
Oh, and he was also claiming he was using his res of 1680x1050, not 1280x1024 ;)

Actually, I think he said he didn't play Oblivion at 1680x1050 because of his monitor or something, but I can handle 1280x1024 with 4xAA in Oblivion fine, and I have a 7900GT :confused:

Is there much of a difference between the 7900GT and the 7900GS?
 
Actually, I think he said he didn't play Oblivion at 1680x1050 because of his monitor or something, but I can handle 1280x1024 with 4xAA in Oblivion fine, and I have a 7900GT :confused:

Is there much of a difference between the 7900GT and the 7900GS?

You obviously don't have everything set to max then, like he was claiming.
 
Actually, I think he said he didn't play Oblivion at 1680x1050 because of his monitor or something, but I can handle 1280x1024 with 4xAA in Oblivion fine, and I have a 7900GT :confused:

Is there much of a difference between the 7900GT and the 7900GS?

Yes, he did admit that he used 1280x1024 for Oblivion, but he also said he used maxed settings and it ran fine, while the [H] review says otherwise. And his original statement was that he runs everything at max settings @ 1680x1050, and that there was no need for people to buy 8 series cards when they came out, because the 7900GS can already run everything great.
 
Bioshock is a new game that can be played in DX10 mode, and there are many other similar games like it. Did you even read anything about me using DX10... no??? Reading comprehension ftl. 30fps is playable in a singleplayer game in some genres; like others stated, I'm a sitting duck if I play an FPS online at 30fps. I can barely newer play games online such as UT3, which I'm getting 25-30 avg fps in.

He's a troll.. he's not a troll.. that word's being thrown around like a football these days.

I was asking if you played in DX10 sarcastically. Understanding sarcasm ftl. Also, "can barely newer play games", sentence structure for the BIG lose.

I'm arguing with them because they're arguing with me. These people get off on trying to prove others wrong. They'd do anything they can to argue, such as posting essays, researching, and linking sites; it's in their nature, they were born with these skills from the start. Many of them haven't mastered the art of "arguing with people on internetz" like I have, but they're still young. As they reach adulthood, they begin to think for themselves. At this stage many struggle to survive the harsh competition. They stop relying on their mothers and begin feeding off the "world wide web" as a main resource. In the end, only the strongest survive, leaving the surviving weak to fend for themselves, like the cowards they are. Even then, there's still a long way to go before they reach their prime state.

I know this 'cause I was one of them.. till I rebelled.. Once a master at their own game, I am now alone in a world of hate... fending for myself like the weak.

Aztec61, is that you?

I suggest everyone start ignoring Manbearpig right now. I think he gets off like this, arguing with people on internetz. I've seen his kind before.

Agreed. But you know... catching a compulsive liar in his own string of lies is quite entertaining :D
 
I was asking if you played in DX10 sarcastically. Understanding sarcasm ftl. Also, "can barely newer play games", sentence structure for the BIG lose.

I honestly didn't think you were being sarcastic, because there are people who thought I actually did say my 7 series card is compatible with DX10, which makes me want to shoot myself. I'll admit, though, I do get slightly overwhelmed with all these posts vs.. well, me. I don't know how I lasted this long, so I expect these things. I'll take the reading comprehension remark back.

Agreed. But you know... catching a compulsive liar in his own string of lies is quite entertaining :D

Call me all you want and believe all you want. The only way I see myself lying is from reading the bullshit that everyone said I...said-- Did I say that right?

And his original statement was that he runs everything at max settings @ 1680x1050, and that there was no need for people to buy 8 series cards when they came out, because the 7900GS can already run everything great.

Yeah, there was no need for ME personally to get the 8800, as I have the 7900GS, which has thus far handled all MY games on max settings. Now, I'm not talking about the cluster of in-game ads and spyware that is BF2142, or games I haven't even heard of, but in this case I did run Oblivion at max settings at 1280x1024.
 
Yeah, there was no need for ME personally to get the 8800, as I have the 7900GS, which has thus far handled all MY games on max settings. Now, I'm not talking about the cluster of in-game ads and spyware that is BF2142, or games I haven't even heard of, but in this case I did run Oblivion at max settings at 1280x1024.

Read one of your original posts in this thread:

People who bought a GTX or Ultra a year ago shouldn't be complaining. You knew this was going to happen yet you bought it anyways. You did it to yourself, it's your loss.

When I upgraded a year ago, I knew I didn't need a $600 8800GTX/Ultra, let alone a GTS, simply because there weren't any demanding games out at that time that even needed such a card. Even now, the only few noticeable games that use the 8 series to its full potential are Bioshock and Crysis (which hasn't even come out yet), and it's already been a full year since the cards were released, and still the people I talk to are having constant driver issues.

My 7900GS, which was $150 a year ago, handled and handles every game I've played at my native resolution (1680x1050) with max settings, and I get decent FPS, even in games like Bioshock. Even though I had the chance to buy an 8800 when they first released (right when my computer exploded), I had no reason to, unlike others who bought it mainly to raise their 3DMark scores (AKA e-peen scores).

I (and possibly everyone else) can't visually tell an improvement between a $150 card running max settings at 40 fps and a $600 card running at 120+ fps in video games, and honestly I don't give a shit if I get a 1000 in 3DMark06. If I can run games fine on max settings, I'm happy, regardless of having a low 3DMark score and an inexpensive computer. I'd rather not pay an extra grand for a benchmark; it looks good on paper, but that's about it.

You attacked those who bought 8800 cards and said there was no need for them, which is complete BS, as I have shown multiple times in a variety of games.

And to quote myself:

If you want me to believe you can play Oblivion on "max" settings @ 1280x1024 with your 7900GS, please put up (some numbers) or shut up. The [H] link I posted was also run on a C2D and an OCed 7900GS.
 
Just put down the shovel, stop digging your own grave, and have a bit of honour. :) Maxing any game on a 7900 GS... I have to laugh at some people. :eek: Mate, I can't even max Bioshock on my 8800 GTS 640MB with reasonable frames.
I think he must consider 0 AA / 0 AF "max settings", then.
 
The exact quote you bolded was me telling the truth. They really shouldn't be complaining, because they did it to themselves. I'm sure others were extremely happy with their cards when they bought them at release, but this statement wasn't directed towards them. I said this as a direct response to the topic. I didn't read the full 4 pages before I said this, but if someone did actually say "Ouch", it was directed at them. I hope that clears that up a little.

As for Oblivion, sadly I don't have it anymore, for various reasons. Take it from the other posters who agreed with me and played Oblivion with "high" settings on similar cards while getting great results.

Also, as I said before, the avg fps I get in Bioshock at max settings is around 30; I didn't say I could get reasonable frames.
 
The exact quote you bolded was me telling the truth. They really shouldn't be complaining, because they did it to themselves. I'm sure others were extremely happy with their cards when they bought them at release, but this statement wasn't directed towards them. I said this as a direct response to the topic. I didn't read the full 4 pages before I said this, but if someone did actually say "Ouch", it was directed at them. I hope that clears that up a little.

So you are still trying to say there were no demanding games a year ago that the 8800 series was needed for, even after all the benchmarks have shown otherwise... Are you just delusional?

As for Oblivion, sadly I don't have it anymore, for various reasons. Take it from the other posters who agreed with me and played Oblivion with "high" settings on similar cards while getting great results.

:rolleyes: Who would that be?

And I do like how [H], even with many settings off or at medium, was only able to get ~30fps avg with their OCed system, yet you can run everything on "High" with yours...

http://www.hardocp.com/article.html?art=MTMwNywzLCxoZW50aHVzaWFzdA==

You must have one magical system to run stuff so much better than everyone else.

Also, as I said before, the avg fps I get in Bioshock at max settings is around 30; I didn't say I could get reasonable frames.

Hmm, funny how you get an avg of 30 when these guys got an avg of 22, and that's w/o any enemies to fight or any weapon FX or anything being used, just running through the level...

How we tested

Since the game doesn't include a built-in utility for benchmarking, we're testing BioShock performance with FRAPS, as we do with many other games we test, such as STALKER, Battlefield, and Oblivion. In this case, we manually run through the Medical Pavilion level of the game after it's been cleared of all the baddies. Our test sequence starts towards the beginning of the level; this area is where the frames per second are at their lowest, likely because it uses shadows extensively. From here we run into the Medical Pavilion foyer, up the left stairs to Surgery, and then hook another left to go to the Crematorium entrance. From there we go up another set of stairs to the Eternal Flame, and that's where we conclude our manual walkthrough.

http://www.firingsquad.com/hardware/bioshock_directx10_performance/images/bshock1600.gif

http://www.firingsquad.com/hardware/bioshock_directx10_performance/page6.asp
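FRAPS-style averages like these come from counting frames over a timed walkthrough. As a rough sketch (the frame times below are made up for illustration, not taken from any review), the average should be computed as total frames over total time, not as a mean of instantaneous FPS values:

```python
def average_fps(frame_times_ms):
    """Average FPS for a run, given per-frame render times in milliseconds.

    Uses total frames / total elapsed time; averaging instantaneous
    FPS values instead would overweight the fastest frames.
    """
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Hypothetical frame times (ms) from a slow stretch of a benchmark run:
run = [40.0, 45.0, 50.0, 42.0, 48.0]
print(round(average_fps(run), 1))  # 22.2
```

The point of the distinction: a run full of brief fast sections and long slow ones can "feel" far worse than its average suggests, which is why an eyeballed "around 30" and a measured 22 can describe the same card.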

But with that, I give up. If you still want to lie about how magically powerful your 7900GS is, go ahead. It's a good card, but it's not as fast as you think it is.

Oh, and if you can't tell the difference between the top and bottom of these images:
http://www.hardocp.com/image.html?image=MTE2Mjg1NDM5NDdlWWVIY2F3WlVfNl80X2wucG5n

It's no wonder you think you're always running the highest settings possible when you aren't.
 