Dark Void PhysX Gameplay Performance and IQ @ [H]

yes i do realize that. i just think nvidia artificially limits the performance of the cpu doing physx. that is why i would like to see what it looks like when physx is being run on the cpu. does the cpu use go up? down? stay the same?

Does Nvidia also artificially limit the performance of the CPU doing Photoshop? How about video encoding? How about audio encoding? How about any of the other plethora of applications that are now being done through CUDA? How about AMD/ATI? Are they purposely limiting the performance of graphics processing on the CPU? I mean heck, it works a hell of a lot better on their GPU, they must be helping put in code to limit it on CPUs. How about ATI's GPGPU applications? Are they in bed with Nvidia to limit how these applications work on CPUs so that they only look like they perform better on GPUs?
 
yes i do realize that. i just think nvidia artificially limits the performance of the cpu doing physx. that is why i would like to see what it looks like when physx is being run on the cpu. does the cpu use go up? down? stay the same?
It's well known that PhysX is much more capable on the CPU than NVIDIA allows. However, if it is demonstrated that the CPU is more than sufficient for good game physics, how then could PhysX be a selling point for NVIDIA GPUs? Furthermore, why would NVIDIA invest anything beyond the bare minimum into developing PhysX on the CPU? It's just a lure to get support in a game that they can then add GPU hardware support to if a deal is reached.
 
This article is a review of what the user will get if they buy this game and run it on NVIDIA and AMD hardware. No more, no less. No editorial for the most part, just reporting. We leave you guys to make up your own minds on this one and I did not feel the need to chime in. It is what it is. Vote with your wallet.
 
while I usually prefer Nvidia over ATI, this physx stuff just doesn't do anything for me. the effects to me just aren't worth the framerate penalty.

really I wish this physx crap would just die as all it ever does is cause an argument every time it's even mentioned.
 
Why bother? This game is merely OK, only a bit of eye candy is lost, and it is joining what is still, after all this time, a tiny list of AAA GPU-accelerated PhysX titles.

So why did you ever bother reading about the game and the comments if it is so unimportant? ;)
 
PhysX is good for gaming and more important than the performance-stealing bitch FSAA ever will be or ever was. This should shut up all the PhysX nay-sayers once and for all. You can see the difference big time. It's right in your face!

nuff said...

it actually makes a huge difference in games that have it, you can see from this article, 7 PhysX Comparison Videos You Need To Watch http://www.gamephys.com/2009/12/29/7-physx-comparison-videos-you-need-to-watch/

now is PhysX needed for those effects? that is a different story.
 
If you want to see the PhysX effects in Dark Void in action there are a couple of great PhysX comparison videos you can watch, check them out here http://www.gamephys.com/2010/01/21/nvidia-dark-void-gpu-physx-comparison-video-released/

also while on the subject, does anyone have any clue what the deal with the PhysX benchmark they released is? how come you can only select the low setting? i can't figure it out. what kind of benchmark is it if it can't do all the PhysX settings in the game?
 
If you want to see the PhysX effects in Dark Void in action there are a couple of great PhysX comparison videos you can watch, check them out here http://www.gamephys.com/2010/01/21/nvidia-dark-void-gpu-physx-comparison-video-released/

also while on the subject, does anyone have any clue what the deal with the PhysX benchmark they released is? how come you can only select the low setting? i can't figure it out. what kind of benchmark is it if it can't do all the PhysX settings in the game?

Good question, perhaps they have been too busy refining the game to put out a full version? Might be they threw together a low version for benchmarking to give a small taste of what was upcoming, and might put out another full version later after the game release and the first batch of fixes are put out?
 
it actually makes a huge difference in games that have it, you can see from this article, 7 PhysX Comparison Videos You Need To Watch http://www.gamephys.com/2009/12/29/7-physx-comparison-videos-you-need-to-watch/

now is PhysX needed for those effects? that is a different story.

Is that a different story or is that THE story?

I think that's the main issue I have - in DV, they could have put generic smoke or particle effects in for the 'non physx' or 'low' physx that would look 80% as good as the physx effects instead of just completely leaving them out. However, there's no selling point for physx then - you'd realize that adding a 2nd video card for the proper swirling of the smoke and movement of the particles vs. just having some generic smoke and particles wasn't worth even $10, let alone the $100 for an 8800GT or something similar. Could the CPU with 2 spare cores do some smoke and particle effects that looked 95% as good as the 'correct' physx-enabled physics? I don't see why not, but as I said before, it's a stupid marketing ploy - not much different than those '64 bit' enabled patches that came out for the athlon 64 that just added some extra pieces of scenery. You had to have an athlon 64, but was it necessary from a technological standpoint? Nope.
 
Is that a different story or is that THE story?

I think that's the main issue I have - in DV, they could have put generic smoke or particle effects in for the 'non physx' or 'low' physx that would look 80% as good as the physx effects instead of just completely leaving them out. However, there's no selling point for physx then - you'd realize that adding a 2nd video card for the proper swirling of the smoke and movement of the particles vs. just having some generic smoke and particles wasn't worth even $10, let alone the $100 for an 8800GT or something similar. Could the CPU with 2 spare cores do some smoke and particle effects that looked 95% as good as the 'correct' physx-enabled physics? I don't see why not, but as I said before, it's a stupid marketing ploy - not much different than those '64 bit' enabled patches that came out for the athlon 64 that just added some extra pieces of scenery. You had to have an athlon 64, but was it necessary from a technological standpoint? Nope.
Yup. A lot of dirty work goes into these marketing schemes to prey on the average customer who has no idea what's going on. I've been saying it for awhile: the time is right for any company to release a pisser physics engine with good multi-core CPU support and scaling. There are a few out there, but so far no one has taken up the torch. As long as there's nothing to compare it to, stupid crap like PhysX will persist.
 
Yup. A lot of dirty work goes into these marketing schemes to prey on the average customer who has no idea what's going on. I've been saying it for awhile: the time is right for any company to release a pisser physics engine with good multi-core CPU support and scaling. There are a few out there, but so far no one has taken up the torch. As long as there's nothing to compare it to, stupid crap like PhysX will persist.

Crytek is making their own Physics engine for Crysis 2 and Cryengine 3 http://www.pcgameshardware.com/aid,...ngine-3-from-Crytek-founder-Cevat-Yerli/News/
 
Most games should; there is no special "support" it needs, only that it can handle higher resolutions, which most games can, since the game only sees one large display. I will check Dark Void in Eyefinity tomorrow and see if it can detect the higher resolutions.

Yes, I am aware that "most games" may support it, but whether or not the fov may be modified is a different story. To be fair, I was expecting a comparison to Eyefinity since you so clearly stated that PhysX adds to the gameplay experience. Why yes, it looks better than no PhysX, but Eyefinity also adds to the experience.

I'm not suggesting that there should have been Eyefinity benchmarks, but a simple statement of whether or not it works with a corrected fov would have made the article better. I realize the article is titled "PhysX gameplay performance and IQ", but it is unfair to imply that the gameplay experience is better with PhysX without crediting the gameplay experience with Eyefinity.

Anyhow, I suppose enough time has been spent looking at such a mediocre release. Batman AA is the only game with PhysX that was actually decent in my opinion.
 
So why did you ever bother reading about the game and the comments if it is so unimportant? ;)

It is not hard to figure out. It's important enough to read/write about, but not important enough to buy a video card over. :) Besides, I already have a 9800GTX+ I used to see if the difference was that big.

It's another Batman AA to me. Devs are either too lazy to include cpu effects/features of similar graphical quality common in games 2 years ago, or it's NV and something fishy going on..
 
I think that's the main issue I have - in DV, they could have put generic smoke or particle effects in for the 'non physx' or 'low' physx that would look 80% as good as the physx effects instead of just completely leaving them out.

If I'm not mistaken they do have simpler, generic effects with PhysX off. There is billboard smoke and precanned "sparks" etc. It's really no different to other heavy duty effects like HDAO, at least in this case the physics effects are actually noticeable without a microscope.
 
I tried to read all comments before posting... I think I caught them all.

Kyle, I'd like to ask you this question since you have a bit more of an inside track than most. Is there any work around for adding a cheap NVidia card along with the ATI?

I disagree with many of the posts here. Hardware companies have always courted PC game developers to include special features which give their hardware an edge. Heck, anyone who plays games with EAX and has moved to Vista or Win 7 knows this... you need a soundcard like Creative to get the full positional sound in many titles, like UT2k4. Anyway, I have little issue with NVidia requiring a person buy their video card to play PhysX, however, I think it's poor that they will not allow you to use their product in conjunction with the AMD card. I for one would own both if Nvidia did this... for now I only own an AMD.

So, maybe as you ask this question of Nvidia, let them know, people will buy the fastest card. ... BUT... if Nvidia allows extra eye candy they will get some of our cash rather than none. :)

Thanks.
 
I tried to read all comments before posting... I think I caught them all.

Kyle, I'd like to ask you this question since you have a bit more of an inside track than most. Is there any work around for adding a cheap NVidia card along with the ATI?

I disagree with many of the posts here. Hardware companies have always courted PC game developers to include special features which give their hardware an edge. Heck, anyone who plays games with EAX and has moved to Vista or Win 7 knows this... you need a soundcard like Creative to get the full positional sound in many titles, like UT2k4. Anyway, I have little issue with NVidia requiring a person buy their video card to play PhysX, however, I think it's poor that they will not allow you to use their product in conjunction with the AMD card. I for one would own both if Nvidia did this... for now I only own an AMD.

So, maybe as you ask this question of Nvidia, let them know, people will buy the fastest card. ... BUT... if Nvidia allows extra eye candy they will get some of our cash rather than none. :)

Thanks.

There is a driver hack for that. It doesn't support the latest drivers, but you really don't need those for PhysX anyway. The hack works very well.
 
Does Nvidia also artificially limit the performance of the CPU doing Photoshop? How about video encoding? How about audio encoding? How about any of the other plethora of applications that are now being done through CUDA? How about AMD/ATI? Are they purposely limiting the performance of graphics processing on the CPU? I mean heck, it works a hell of a lot better on their GPU, they must be helping put in code to limit it on CPUs. How about ATI's GPGPU applications? Are they in bed with Nvidia to limit how these applications work on CPUs so that they only look like they perform better on GPUs?

my only point is this: http://www.tomshardware.com/reviews/batman-arkham-asylum,2465-10.html

when physx is enabled with batman aa the cpu utilization went down. i just want to know if it does the same with this game.

so when i see this

[attached image]


it makes me wonder what the cpu utilization looks like.
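
If anyone wants to check this themselves, here's a rough sketch of how you could log it. Just an illustration I'm throwing together in Python with the psutil package (nothing from [H]'s actual test setup, and the file name and run length are made-up placeholders): sample per-core CPU usage once a second during a gameplay run with GPU PhysX enabled, then repeat the same run with PhysX on the CPU and compare the two logs.

Code:
# Rough sketch: log overall and per-core CPU usage once per second while the
# game runs, so two runs (GPU PhysX vs. CPU PhysX) can be compared afterwards.
# Assumes the third-party psutil package is installed (pip install psutil);
# RUN_SECONDS and LOG_FILE are placeholder values.
import csv
import time

import psutil

SAMPLE_SECONDS = 1.0
RUN_SECONDS = 120            # length of the gameplay run to record
LOG_FILE = "cpu_usage_log.csv"

def main():
    ncores = psutil.cpu_count(logical=True)
    with open(LOG_FILE, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "avg_pct"] + [f"core{i}_pct" for i in range(ncores)])
        start = time.time()
        while time.time() - start < RUN_SECONDS:
            # cpu_percent blocks for the interval and returns usage over that window
            per_core = psutil.cpu_percent(interval=SAMPLE_SECONDS, percpu=True)
            avg = sum(per_core) / len(per_core)
            writer.writerow([round(time.time() - start, 1), round(avg, 1)]
                            + [round(p, 1) for p in per_core])

if __name__ == "__main__":
    main()

If Batman AA is anything to go by, the interesting number is whether the average actually drops when the GPU takes over the physics work.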
 
while I usually prefer Nvidia over ATI, this physx stuff just doesn't do anything for me. the effects to me just aren't worth the framerate penalty.

really I wish this physx crap would just die as all it ever does is cause an argument every time it's even mentioned.

This makes no sense whatsoever! You guys will take the hit for FSAA (which in my opinion does jack and shit for quality) but not for PhysX, which actually adds effects to the game?

LOL
 
Curious if any of the computer journalists have ever done research into the real driver of this eye candy debate. Is it true that AMD cards simply can't render this type of detail, or is it more true that the game developers got paid off to use PhysX instead of a more open technology that allows both cards to render more advanced visual effects? I know there are trade-offs and value adds to physics processing on the card (and how Nvidia has chosen to implement it) but with its limited applications today it seems to me more of an aggressive marketing ploy than anything.

I think the deeper question the industry should be asking is WHY are developers really choosing to use PhysX? Does it truly offer the end user some value over other solutions and rendering options out there, or is it completely a marketing game Nvidia is playing against AMD by pushing its technology on developers in an effort to make it appear that their product is superior when it isn't? Manufacturing reasons to pick a hardware solution by forcing the market makes me feel pretty blah about ever buying Nvidia again. Particle effects and glowies aside.
 
This makes no sense whatsoever! You guys will take the hit for FSAA (which in my opinion does jack and shit for quality) but not for PhysX, which actually adds effects to the game?

LOL
you have been nothing but a troll for the entire thread and did I even mention AA at all? but since you brought it up, when the fuck would turning on a little AA bring my minimum framerate down to the 20s or teens like even low physx will do? if you enjoy the weak little effects as opposed to a much higher framerate then that's great for YOU. I was clearly saying that for ME it's not worth it.
 
You do realize that even using our multiple cores at 100% they cannot handle physx like this, don't you?
That doesn't mean they can't be optimized. instead of 50,000 particle effects a cpu could handle 20,000 at the low setting. There's absolutely no reason not to try.
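
Just to put something behind that (and to be clear, this is a toy sketch in Python/numpy, nothing to do with the actual PhysX SDK, and the 50,000 vs 20,000 budgets are only the rough figures above): the snippet below times a naive, single-threaded particle update at a 'high' and a 'low' budget so anyone can see for themselves what a plain CPU does with those counts before any real optimization or multi-threading.

Code:
# Toy illustration (not the PhysX SDK): time a simple particle update step on
# the CPU at a "high" and a "low" particle budget. Single-threaded, vectorized
# with numpy; a real engine would do far more per particle, but could also use
# more than one core.
import time

import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])
DT = 1.0 / 60.0  # one 60 fps frame

def step(pos, vel):
    """One Euler integration step with gravity and a crude ground bounce."""
    vel += GRAVITY * DT
    pos += vel * DT
    below = pos[:, 1] < 0.0          # particles that fell through the floor
    pos[below, 1] = 0.0
    vel[below, 1] *= -0.4            # lose some energy on the bounce
    return pos, vel

def time_budget(n_particles, frames=600):
    pos = np.random.rand(n_particles, 3).astype(np.float32) * 10.0
    vel = np.zeros_like(pos)
    start = time.perf_counter()
    for _ in range(frames):
        pos, vel = step(pos, vel)
    ms_per_frame = (time.perf_counter() - start) / frames * 1000.0
    print(f"{n_particles:6d} particles: {ms_per_frame:.3f} ms per frame")

for budget in (50_000, 20_000):      # "high" vs. a hypothetical CPU "low" budget
    time_budget(budget)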


Don't you think there is a reason CUDA performs so much better for certain applications than a CPU?
Irrelevant to the discussion. Photoshop and all other applications are available to people who want to buy an ATI card.


I am sure if they optimized they could do a bit better with the CPU but first, I do not think they care for that of course, and still it won't match the GPU performance so why bother?
Because not everyone wants to spend $100 for a PhysX card that can handle the "high" preset, and even fewer people want to spend $500 on a video card.

The gtx 295 couldn't even enable AA at high resolutions while having the preset set to high. If the flagship card which is ALREADY essentially two cards cannot handle a proprietary standard, there is something seriously wrong with that standard.

Plus, it was on UE3. 48 avg fps and 17 min is seriously terrible for an engine that old.


A freakin used 8800gt is dirt cheap so just get one and stop complaining.
PhysX is a joke and needs more than one card to run effectively now. Unless you're swimming in money, you'd have to be delusional to side with nVidia on this.
 
That doesn't mean they can't be optimized. instead of 50,000 particle effects a cpu could handle 20,000 at the low setting. There's absolutely no reason not to try.

Irrelevant to the discussion. Photoshop and all other applications are available to people who want to buy an ATI card.

Because not everyone wants to spend $100 for a PhysX card that can handle the "high" preset, and even fewer people want to spend $500 on a video card.

The gtx 295 couldn't even enable AA at high resolutions while having the preset set to high. If the flagship card which is ALREADY essentially two cards cannot handle a proprietary standard, there is something seriously wrong with that standard.

Plus, it was on UE3. 48 avg fps and 17 min is seriously terrible for an engine that old.

PhysX is a joke and needs more than one card to run effectively now. Unless you're swimming in money, you'd have to be delusional to side with nVidia on this.
Excellent post. I think [H]'s findings show just how much NVIDIA has gimped PhysX to run best on their hardware, except that their hardware can't handle it, which, to be honest, is pathetic. What a perfectly good waste of a physics engine.
 
The whole point of making games is to allow people to enjoy them, not prevent half the player-base from seeing the game as the developers created it.

There are other physics engines that don't prevent people from being able to experience the game as it was created.

It is one thing to put in new DirectX features and a wholly different thing to make half of your potential consumers experience the game in a substandard way.
 
Is it true that AMD cards simply can't render this type of detail, or is it more true that the game developers got paid off to use PhysX instead of a more open technology that allows both cards to render more advanced visual effects?

Neither. Nvidia invested in the hardware and software to enable physics acceleration on their products. AMD didn't. End of story.
 
Neither. Nvidia invested in the hardware and software to enable physics acceleration on their products. AMD didn't. End of story.

Yep. AMD has had multiple chances to license it, but their stance is that they don't wish to support CUDA or PhysX due to its closed nature. Or something like that.
 
Well if I was AMD I wouldn't want to license the competition's proprietary platform either. But their problem is that they don't have an alternative to offer (as yet).
 
The point is more about the fact that there really isn't even a need to push some of this content into the realm of PhysX. Are all of these little enhancements things that can only be done using PhysX? Most of the time, NO. So why would AMD invest in NVIDIA's technology when it is little more than a scam at this point?

The other question would be why game developers are using PhysX and not an open standard of development that can do the same or better graphics.

The investment NVIDIA made into PhysX is null if it is a closed standard, and offers no real benefit to gamers. The way they are marketing it and sometimes lying about it is suspect to me. Kyle has his finger on the pulse of this as well. I just don't think it is a big enough problem in the market to warrant a huge exposé yet. With games like this coming out more and more it will become a larger issue of NVIDIA forcing devs to screw us out of a solid game experience just for choosing a competing product.
 
The other question would be why game developers are using PhysX and not an open standard of development that can do the same or better graphics.

twimtbp. how much do you think nvidia gave them to put physx into the game?
 
twimtbp. how much do you think nvidia gave them to put physx into the game?

I'm not actually sure what your position is, but many people would argue that the practice you're referring to is extremely unethical. TWIMTBP and other nVidia idiocy was what convinced me to go with a 5850, as I for one will not tolerate such underhanded marketing tactics.
 
That doesn't mean they can't be optimized. instead of 50,000 particle effects a cpu could handle 20,000 at the low setting. There's absolutely no reason not to try.
You are just guessing numbers there....very scientific approach...try yes but what's the point, it won't be as good and Nvidia does not sell CPUs...do you think ATI should spend time helping the guys doing SoftTH?

Irrelevant to the discussion. Photoshop and all other applications are available to people who want to buy an ATI card.
Irrelevant is your reply to that.
I meant that as an example of how much faster GPUs can be at some tasks than CPUs, not that Photoshop was an Nvidia-only product due to CUDA...how about learning to read a bit?

Because not everyone wants to spend $100 for a PhysX card that can handle the "high" preset, and even fewer people want to spend $500 on a video card.
First, I buy all in the used market...I am not a snob that needs everything brand new.
Second, not everyone needs Eyefinity, or 3D Vision or even a dedicated video card for that matter so what is your point? But it is good to have choices isn't it?

The gtx 295 couldn't even enable AA at high resolutions while having the preset set to high. If the flagship card which is ALREADY essentially two cards cannot handle a proprietary standard, there is something seriously wrong with that standard.
High Physx recommends a dedicated card so no surprise there.

PhysX is a joke and needs more than one card to run effectively now. Unless you're swimming in money, you'd have to be delusional to side with nVidia on this.
Swimming in money for an 8800gt at $50 in the used market or a new but slower card (still better than nothing) on the new market? Perhaps someone needs to look for a new job cause if you can't afford that you are in the wrong hobby...look at 360 or PS3 for budget action. ;)
 
First, I buy all in the used market...I am not a snob that needs everything brand new.

lol People who buy new cards are snobs, you heard it here. Because it isn't snobby to generalize a group of people as snobs.
 
First, I buy all in the used market...I am not a snob that needs everything brand new.
Perhaps someone needs to look for a new job cause if you can't afford that you are in the wrong hobby...look at 360 or PS3 for budget action. ;)
Yeah, you aren't a snob at all.

Anyone who recommends buying a dedicated PhysX card automatically loses any and all credibility he/she had.
 
The point is more about the fact that there really isn't even a need to push some of this content into the realm of PhysX. Are all of these little enhancements things that can only be done using PhysX? Most of the time, NO. So why would AMD invest in NVIDIA's technology when it is little more than a scam at this point?

The other question would be why game developers are using PhysX and not an open standard of development that can do the same or better graphics.

The investment NVIDIA made into PhysX is null if it is a closed standard, and offers no real benefit to gamers. The way they are marketing it and sometimes lying about it is suspect to me. Kyle has his finger on the pulse of this as well. I just don't think it is a big enough problem in the market to warrant a huge exposé yet. With games like this coming out more and more it will become a larger issue of NVIDIA forcing devs to screw us out of a solid game experience just for choosing a competing product.

I am going to be real honest and say that I wanted to see what the community was thinking on this and did not want to write a conclusion that looked to lead you guys one way or another. What YOU think is important here more than what I think.
 
TWIMTBP and other nVidia idiocy was what convinced me to go with a 5850, as I for one will not tolerate such underhanded marketing tactics.
Joining up with NVIDIA and TWIMTBP doesn't prohibit developers from joining up with AMD as well. There's nothing to suggest that developers get paid to be a part of TWIMTBP, either. If you want to speculate that it's all about cash changing hands between NVIDIA and developers, cool, but making buying decisions based on speculation about NVIDIA's so-called "underhanded marketing tactics" seems a bit odd to me.
 
Joining up with NVIDIA and TWIMTBP doesn't prohibit developers from joining up with AMD as well. There's nothing to suggest that developers get paid to be a part of TWIMTBP, either. If you want to speculate that it's all about cash changing hands between NVIDIA and developers, cool, but making buying decisions based on speculation about NVIDIA's so-called "underhanded marketing tactics" seems a bit odd to me.

True, but AMD has been teaming up with developers for multi monitor gaming support, DX11, and tessellation support which can be used on both ATI and nvidia platforms, once nvidia begins supporting all those things. TWIMTBP normally consists of pushing hardware physx and locking ATI users out of certain features.
 
You are just guessing numbers there....very scientific approach...try yes but what's the point,

The point is people have a choice as to what hardware they use. Urr durr.

it won't be as good and Nvidia does not sell CPUs...do you think ATI should spend time helping the guys doing SoftTH?

A. I'm sure some other company can make sparks, water, and cloth.

B. They're already supporting open-source standards by promoting their usage.


Irrelevant is your reply to that.
Oh snap, can someone hand me some ice for that burn?

I meant that as an example of how much faster GPUs can be at some tasks than CPUs, not that Photoshop was an Nvidia-only product due to CUDA

I suppose this would be a good time to illustrate the benefits of PhysX, because you totally dodged my question by simply C+Ping your original argument.

You don't need nVidia GTX dual-slot (tm) technology to run Photoshop. It's available to everyone, and honestly a quad core does well on basic image manipulation. Also, CUDA has nothing to do with nVidia forcing proprietary standards on corporations by denying them R&D money. That's what the discussion is about.

You're basically trying to justify nVidia forcing ATI out of the market by saying that the company does other things too. It is IRRELEVANT. :rolleyes:

...how about learning to read a bit?

NEW with nVIDIA (tm) (c) HYPER CONDESCENDING technology (c) (tm) you can defeat your argument twice as fast by pulling at straws with your GPU!

Never has losing an argument been this InnoVative (tm)!


First, I buy all in the used market...
Good. I'm glad that nVidia didn't see any profits from your purchase.

I am not a snob.
Thanks for clearing that up, when you told me to learn to read I was getting worried.

Second, not everyone needs Eyefinity, or 3D Vision or even a dedicated video card for that matter so what is your point?

If they want to run on "PhysX-High" with anything other than a $600 dual-card flagship, then yeah apparently they do. Run it on medium, you say? Why would a company so hell-bent on forcing every consumer to have an nVidia GPU make cards that can't even run their ridiculous forced standard?

Furthermore...

High Physx recommends a dedicated card so no surprise there.

nVidia should allow the CPU and their GPU to work in tandem so that the consumer doesn't have to buy the absolute top end if they want to experience the game... well... the way it's meant to be played. But they don't.

But it is good to have choices isn't it?
Choices between a GTX 420 and GTX Hyper 599? No. Consumers want real choice that spans off of the GeForce line and into what the other companies are offering. A choice between a box of oatmeal and a dual card box of oatmeal that looks like a vhs does not leave room for delicious bacon. Delicious ATI bacon.




Swimming in money for an 8800gt at $50 in the used market or a new but slower card (still better than nothing) on the new market? Perhaps someone needs to look for a new job cause if you can't afford that you are in the wrong hobby...look at 360 or PS3 for budget action. ;)
:rolleyes:

I am not a snob.
:confused:
 
Joining up with NVIDIA and TWIMTBP doesn't prohibit developers from joining up with AMD as well. There's nothing to suggest that developers get paid to be a part of TWIMTBP, either. If you want to speculate that it's all about cash changing hands between NVIDIA and developers, cool, but making buying decisions based on speculation about NVIDIA's so-called "underhanded marketing tactics" seems a bit odd to me.

Right, the reason why there's never (to my knowledge) any ATI branded games is simply because publishers want to put the nVidia logo on their box's cover art. I'll buy that.
 
Why not use ATI+Nvidia Physx to get a better comparison?

Nvidia disabled this option in a driver update. They don't want you mixing and matching cards even though it works, because it could cause 'unexpected' results...like a crash? Yet even when games don't crash, it's removed anyway.

It's basically that nVidia doesn't want technology that can work on ATI/nVidia combos to work, because heaven help us if we all bought 5970s and then a cheap-o 9800 GT for PhysX to get the best of both worlds while giving most of our money to ATI. The world would end!! ohh noes!
 
TWIMTBP normally consists of pushing hardware physx and locking ATI users out of certain features.
I don't even really know if it entails 'pushing' PhysX, but that might very well be a part of it now, yeah. I'd suspect, though, that a developer is going to make the decision to include PhysX before entering any kind of relationship with NVIDIA and leverages TWIMTBP to ease the implementation of HW PhysX. NVIDIA gets games with their proprietary hardware physics shit and developers get help with the implementation.

Right, the reason why there's never (to my knowledge) any ATI branded games is simply because publishers want to put the nVidia logo on their box's cover art. I'll buy that.
AMD no longer has much of a developer relations division. The reason why you rarely see an ATi/AMD logo on a game box is because NVIDIA's 'devrep arm' is much larger and presumably more focused than AMD's. If AMD put as much resources into dev relations as NVIDIA presumably does with TWIMTBP, you might see a lot more AMD/ATi logos on boxes.

TWIMTBP is designed to be a mutually beneficial thing. NVIDIA gets their logo on the box and an intro video and, in exchange, developers don't have to go through the process of optimizing shaders and other graphics-related code and throwing company resources at it. The catch, of course, is that you'll likely get some NVIDIA-specific optimizations that may hinder performance on AMD boards. The solution is to get AMD to throw their hat in the ring as well, which developers just don't seem to do much of (so get angry at them about it, not NVIDIA).
 