NVIDIA using intimidation tactics against Oxide

If DX12 is truly a low-level API then you will have to write different code paths for each architecture that you want to optimize for.

If it's Kepler, run this code.
If it's Maxwell, run this code.
If it's GCN 1.1, run this code.
If it's GCN 1.2, run this instead.

This is all very basic stuff here.
No, you should use feature detection, just like you have to do in D3D11; then it will work as designed on all current and future GPUs. You might need some special cases for drivers that tell porkies about what features they support, though.
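For reference, here's a minimal sketch of what that feature-detection approach looks like in D3D12 (this assumes the Windows 10 SDK, the default adapter, and linking against d3d12.lib; it just prints a few of the queryable option tiers, nothing more):

```cpp
#include <d3d12.h>
#include <cstdio>

int main()
{
    // Create a device on the default adapter at the minimum feature level we accept.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device),
                                 reinterpret_cast<void**>(&device))))
        return 1;

    // Ask the driver what it actually supports, instead of branching on the GPU's name.
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &options, sizeof(options))))
    {
        std::printf("Resource binding tier:      %d\n", (int)options.ResourceBindingTier);
        std::printf("Tiled resources tier:       %d\n", (int)options.TiledResourcesTier);
        std::printf("Conservative raster tier:   %d\n", (int)options.ConservativeRasterizationTier);
    }

    device->Release();
    return 0;
}
```

The point is that the same query works on any vendor's driver, current or future, so you don't hard-code Kepler/Maxwell/GCN paths by name.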
 
For one, nvidia started throwing a fit.

That still doesn't qualify as intimidation. HOW did Nvidia intimidate Oxide? Is NV the new Pablo Escobar? I don't think you truly understand what intimidation means.

You are just spreading typical BS FUD.

Once again, someone who buys Nvidia isn't getting the full package of capabilities that Nvidia claims their products possess.

You do realize DX12 is in its infancy, right? Neither GPU company can claim their cards will fully utilize all DX12 features. Hallock even stated: "I think gamers are learning an important lesson: there’s no such thing as “full support” for DX12 on the market today."


You seem to have skipped this quote from Oxide as well:

"Weather or not Async Compute is better or not is subjective, but it definitely does buy some performance on AMD’s hardware. Whether it is the right architectural decision for Maxwell, or is even relevant to it’s scheduler is hard to say."

Or this summary from the Wccftech article:

"Still, NVIDIA users need not be concerned for the time being. First and foremost, not all games will be as reliant on draw calls as Ashes of the Singularity (or any RTS); moreover, even the Oxide Games’ dev admits that games developed with Unreal Engine 4 probably won’t show significant use of Async Compute, and the list of upcoming games made with UE4 grows each day."

There are 34+ PC games in development using UE4. How many are using Nitrous? One? MS and their exclusive devs have jumped all over the UE4 engine.

Just like with previous DX versions, until actual AAA titles are launched and the API matures you don't know what kind of performance is or is not possible on a set hardware.

You also have to take into account that 81% of the discrete market is NV's. Will devs spend 2, 3, 4+ years working on a title that only 19% of the market can take full advantage of?

It's too early to count either GPU vendor out, and by the time DX12 games are finally launched, new GPUs will be out.

But thanks for the awesome attention grabbing thread title!
 
This forum sure has turned to utter shit lately.
Yeah, the name calling and fanboyism is just out of control ATM.
Just enjoy the show. Bunch of people (or maybe just a couple w/ multiple logins) who seem to care about the ethics of video card manufacturers more than just about anything else.

Or maybe they spend equal time debating Samsung vs. Vizio, Whirlpool vs. LG, Liftmaster vs. Genie, Rainbird vs. Orbit..


Rainbird FTW.
 
Don't get me started on Liftmaster and Genie. When I need to reach high places quickly, reliably, and from a company that open-sources its lift-designs for the good of humanity, I accept no substitute for Genie. Fuck anyone who says differently!







:D
 
What did NVidia do to 3DFX? Oh, that's right, stole their stuff and sued them into oblivion...

Drivers that literally burned people's cards up? Check
Distributed known faulty chips and got caught? Check
Lost the console contract with MS because of their greed? Check
Lost the console contract with Sony because they had nothing to offer outside of Targa, which gets laid to waste by AMD's all-in-one solution
Disabled overclocking of notebook chips not once but twice...

Reading comprehension is your friend; you totally missed this part of my post, which you even included in the quote:

Meanwhile, Nvidia may have some crappy tactics,

Plus, I am not sure what Targa is. Do you mean Tegra or Tesla? Both are used quite a bit.

Also, you talk about contracts, and it's good that AMD won those console contracts, but it wasn't that much money in the grand scheme of things; they are still losing out overall to both Nvidia and Intel. Their entire marketing campaign is basically to keep stirring up negativity against Nvidia, or to oversell technical jargon that doesn't end up making much of a difference in the performance of most games. They are still losing performance-wise to Nvidia even with all these 'advanced' technologies they keep touting. So really it's not about defending Nvidia; it's about calling AMD out on their own crap.

If you want to start comparing the sins of both companies, we would have a really long list for both. My point is there is a lot of knee-jerk reaction to everything these days, but it seems like people hold AMD up like they are the golden child, when they have been just as bad as Nvidia.
 
What did NVidia do to 3DFX? Oh, that's right, stole their stuff and sued them into oblivion...

Drivers that literally burned people's cards up? Check
Distributed known faulty chips and got caught? Check
Lost the console contract with MS because of their greed? Check
Lost the console contract with Sony because they had nothing to offer outside of Targa, which gets laid to waste by AMD's all-in-one solution
Disabled overclocking of notebook chips not once but twice...

ROFL! "That's cute, I remember when I had my first beer." :rolleyes:
 
I'm a software dev. & physicist in the DC area. Chances are I'm making more money then you.

Not to be a grammar nazi or anything, but I would think someone in the scientific and developer realms would have a decent grasp of the English language.
 
If DX12 is truly a low-level API then you will have to write different code paths for each architecture that you want to optimize for.

If it's Kepler, run this code.
If it's Maxwell, run this code.
If it's GCN 1.1, run this code.
If it's GCN 1.2, run this instead.

This is all very basic stuff here.

No, you should use feature detection, just like you have to do in D3D11; then it will work as designed on all current and future GPUs. You might need some special cases for drivers that tell porkies about what features they support, though.

You're both right. The first one is right at face value; the second one is correct in practice, because it probably isn't that low-level. They are fooling you guys with all this "down to the bare metal" talk.
 
Fixed for truth

Actually you can never have a decent discussion on here anymore. It has nothing to do with making someone believe.

TBH it's almost all the tech forums nowadays. It's all the same.

Cept for maybe Beyond3d.
 
What did NVidia do to 3DFX? Oh, that's right, stole their stuff and sued them into oblivion...

Drivers that literally burned people's cards up? Check
Distributed known faulty chips and got caught? Check
Lost the console contract with MS because of their greed? Check
Lost the console contract with Sony because they had nothing to offer outside of Targa, which gets laid to waste by AMD's all-in-one solution
Disabled overclocking of notebook chips not once but twice...

How about sabotaging their own cards by disabling workstation features to force you to buy Quadro cards, or disabling the PhysX drivers when any card besides an Nvidia card is detected as the primary card? The secondary PhysX card becomes useless as punishment for not buying their brand. I mean, yikes, that's just shady. Not to mention their use of FreeSync/Adaptive Sync in their Nvidia-based laptops. I don't hold any particular brand loyalty, but apparently this company gets nothing but praise despite being incredibly awful. Their 980 Ti is fantastic and reasonably priced, I'll give them that.
 