AMD's DirectX 11 Lead Is 'Insignificant,' NVIDIA Says

You guys need to stop living in the past. Not having DX11 is of zero importance. It just does not matter. Consider that nearly all the exclusive PC games that came out this year didn't even use 10; most exclusive PC games only use 9.x, and most of the ports use 9.x. Crysis 2 was perhaps the only hope, but that's going multi-platform. In all seriousness, a superfast DX9 card is all that's required as long as the Xbox 360 continues to drive PC games.

DirectX10 never made it far because 90% of the world was still using XP. With millions of Win7 sales and the gradual move away from XP with new computer purchases, DirectX can finally advance again.

DirectX11 will be different. Just as drivers were ready for Win7 thanks to Vista, DirectX can now move ahead, thanks to Vista and Win7.

NVidia is wrong. DirectX11 needs to be pushed so we can get back on track again.
 
NVidia is wrong. DirectX11 needs to be pushed so we can get back on track again.
Nvidia will continue their support of game devs, and I'd bet they are working with them now, helping them understand how best to take advantage of Fermi in DX11.
 
As long as they (video card manufacturers) don't try to pull proprietary shit like 3dfx (Glide) and PowerVR (SGL) did, I won't care what API the devs use. That, and as long as they move forward and stop hugging the past because some people won't let go of XP, or because of consoles.
 
You guys need to stop living in the past. Not having DX11 is of zero importance. It just does not matter. Consider that nearly all the exclusive PC games that came out this year didn't even use 10; most exclusive PC games only use 9.x, and most of the ports use 9.x. Crysis 2 was perhaps the only hope, but that's going multi-platform. In all seriousness, a superfast DX9 card is all that's required as long as the Xbox 360 continues to drive PC games.

Wow, then your PC is no better than a $200 Xbox.

People who don't have the hardware can't see the purpose or the benefits. Get the hardware first before saying something is not important.

Will future consoles run DX9? Nope. The PC has always been the test bed for future console games. Pushing DX11 now will only ensure consoles use DX11 or higher in the near future. All this talk about consoles and porting makes me laugh. We're talking about the future of gaming, not old 2005-2006 repackaged hardware.
 
DX11 isn't everything...
I want eyefinity. If I get DX11 compatible hardware with it, so much the better.
I'm still pissed that I can't run multiple monitors off of my SLi setup, and it's going to take a hell of a lot from nVidia if they want my next 2 or 3 generations of video hardware to be their products.

Are you listening, nVidia? You can start by releasing a driver update that allows multiview with SLi setups...

You *can* run multiple monitors on an SLI setup. Nvidia added it to their drivers over a year ago.
 
Wow, then your PC is no better than a $200 Xbox.

People who don't have the hardware can't see the purpose or the benefits. Get the hardware first before saying something is not important.

Will future consoles run DX9? Nope. The PC has always been the test bed for future console games. Pushing DX11 now will only ensure consoles use DX11 or higher in the near future. All this talk about consoles and porting makes me laugh. We're talking about the future of gaming, not old 2005-2006 repackaged hardware.

No one is going to push DirectX 11. Developers are going to wait for the new console dev kits. No one is going to invest in developing for an API that might not be in either the next Xbox or PS; that's just the reality of the situation. The last game that used objectively more power than current-generation consoles was Crysis, and the next one won't.
 
They are correct: the only DirectX that will matter is whatever goes into the next-gen consoles. Sad but true...
 
The XBox consoles do not use DX itself, but a derived API, which makes porting easier but isn't a 1:1 match.

I never said the Xbox used DirectX. I was just pointing out that it does use Direct3D, which is one of the major APIs in DirectX (and in fact, the two terms are often used interchangeably). DirectX is just a bunch of APIs wrapped up into one nice neat package (one of which is Direct3D), so once again, it's hard to claim that it is -completely- nonexistent on the Xbox.

Direct3D (the 3D graphics API within DirectX) is widely used in the development of video games for Microsoft Windows, Microsoft Xbox, and Microsoft Xbox 360.
 
I never said the Xbox used DirectX. I was just pointing out that it does use Direct3D, which is one of the major APIs in DirectX (and in fact, the two terms are often used interchangeably). DirectX is just a bunch of APIs wrapped up into one nice neat package (one of which is Direct3D), so once again, it's hard to claim that it is -completely- nonexistent on the Xbox.

Well, they streamlined the D3D implementation and API for the XBox since it uses fixed hardware. No use in going for a generic design and implementation when you have one GPU to use. It most definitely isn't the same thing. Note that I never said that it doesn't exist in some form on the XBox consoles, just that they went with a custom API there.
 
No one is going to push DirectX 11. Developers are going to wait for the new console dev kits. No one is going to invest in developing for an API that might not be in either the next Xbox or PS; that's just the reality of the situation. The last game that used objectively more power than current-generation consoles was Crysis, and the next one won't.

Actually, this is completely wrong. Video card makers and other hardware companies will always push the latest gear on game developers; why do you think nVidia has the "TWIMTBP" program? ATi has the same type of program in place to work with the game studios. Even the current console games are all developed on PCs. Next-gen PC games are already leaving the consoles way behind; just look at the differences between the Xbox 360 version of MW2 and the PC version. The Xbox 360 might as well have a big "welcome to 2006" sign on it, while the PC version looks fantastic on a high-end gaming PC.

The one difference here is that if you're a console game development studio, you see no reason to develop a more graphically advanced game while sales are strong. We all know how stupid it is to pay $60 for a crappy console game and then $20 more every 2 months for some downloadable content... but people keep doing it. Once console developers start whining that they can't do the fancy stuff their PC game-developing buddies can, you can rest assured the hardware in consoles will move to the next level...

But this misses the whole point anyway... who gives a rat's ass about console games? I'm a PC gamer, and there are plenty of titles available and a whole lot more in development. People keep acting like PC gaming is dying, so why buy new hardware... the same BS has been thrown out consistently year after year ever since somewhat decent consoles came out, but nothing has really changed at all. PCs lead the way for console games... always have, always will.
 
What do you mean by this?

I meant we need to move away from the past APIs we're stuck with because of XP's popularity. XP sucked everybody in (developers, businesses, users, schools) because it was allowed to stay around for much too long.

Having a very successful OS is nice and all, but it also causes technology to go stale, as people aren't willing to break free of it.

DirectX10 failed to pull people away because it was clunky and slow, and Vista's lack of popularity didn't help much. It's up to Windows 7, DirectX11, and whatever new tech comes along (OpenGL, Fermi, Eyefinity, etc.) to get us away from it.

NVidia isn't helping with that comment.
 
NVidia isn't helping with that comment.
They aren't hurting DX11 with it either. Games are already slated to release in 2010 with DX11. They are just saying they might be late to the party, but the party will still be in full swing when they arrive. :p
 
Will future consoles run DX9? Nope. The PC has always been the test bed for future console games. Pushing DX11 now will only ensure consoles use DX11 or higher in the near future.

Nobody outside of Microsoft and their 3rd-party devs uses DirectX. That includes on consoles. Do you think the PS3 or Wii use DirectX? They don't; they use OpenGL. Neither did the GameCube, PS2, Dreamcast, Nintendo DS, or PlayStation Portable, nor do the sizable majority of smartphones, movie production companies, etc.

OpenGL is absolutely poised to make a comeback of some sort in the consumer market (it remains dominant in the professional market). Until the last few years it had been allowed to fall well behind in areas useful for game development. That really isn't the case nearly as much anymore, and in some ways it outpaces Direct3D, such as with tessellation, which it has had support for literally years before DX11 released.

The only thing pushing DX11 now does is keep them close in feature parity and help prevent game devs from moving to OpenGL en masse.
 
Nobody outside of Microsoft and their 3rd-party devs uses DirectX. That includes on consoles. Do you think the PS3 or Wii use DirectX? They don't; they use OpenGL. Neither did the GameCube, PS2, Dreamcast, Nintendo DS, or PlayStation Portable, nor do the sizable majority of smartphones, movie production companies, etc.

OpenGL is absolutely poised to make a comeback of some sort in the consumer market (it remains dominant in the professional market). Until the last few years it had been allowed to fall well behind in areas useful for game development. That really isn't the case nearly as much anymore, and in some ways it outpaces Direct3D, such as with tessellation, which it has had support for literally years before DX11 released.

The only thing pushing DX11 now does is keep them close in feature parity and help prevent game devs from moving to OpenGL en masse.

Console ports are bad enough already, thanks. No need to make it easier for companies to make shitty console ports on the PC.
 
As someone with two 5870s, I can say they're right... it's not like I use DX11 in anything I play or do...
 
Ditto.

nVidia is in no place to talk trash unless they release a card that is the modern equivalent of an 8800GTX.

Actually, they're in precisely the position one would need to be in to say: who cares if we miss the first few months with (next to) no software support, we'll be there before there's anything worth getting DX11 for. DiRT 2 has shadows under boulders that you drive by in a fraction of a second anyway, yippee!
 
Those who do not remember history are doomed to repeat it.

ATi has won a grand total of two rounds against nVidia, ever: the 9700 vs. 5xxx era, and the short-lived X19xx vs. 79xx era.

After the first one, nVidia's 6xxx came out and dominated the landscape. You can still buy the things in the ultra-economy segment, for Pete's sake.

And what was nVidia's answer to the X19xx? Years of market and performance leadership with the G80, G92, and GT200. How many launch delays and missed cycles did we see out of ATi in that era?

I particularly chuckle at the indignation aroused in some people by nVidia's re-branding schemes. "I was robbed!" No, you weren't (by the tone of your posts, I doubt you ever buy nVidia regardless), and neither was anyone else. The products were reviewed and compared to those of competitors. If performance was not acceptable, buy something else. If it was acceptable, why complain? Fact is, nVidia could get away with re-branding older architectures for the following simple reasons: 1. Improvements in process technology and design optimization led to measurable performance gains even on respun older architectures. 2. ATi was not offering anything that performed better.

I don't know how a crippled, limping, lackluster AMD/ATi managed to beat nVidia out of the gate this time. But by March or April of next year, no one will care, if Fermi is out and it rocks. ATi hasn't won a third round just by being first to market. Either nVidia will have to be VERY, VERY late, or Fermi will have to stink. A student of history would not take that bet.
 
Those who do not remember history are doomed to repeat it.

ATi has won a grand total of two rounds against nVidia, ever: the 9700 vs. 5xxx era, and the short-lived X19xx vs. 79xx era.

After the first one, nVidia's 6xxx came out and dominated the landscape. You can still buy the things in the ultra-economy segment, for Pete's sake.

And what was nVidia's answer to the X19xx? Years of market and performance leadership with the G80, G92, and GT200. How many launch delays and missed cycles did we see out of ATi in that era?

I particularly chuckle at the indignation aroused in some people by nVidia's re-branding schemes. "I was robbed!" No, you weren't (by the tone of your posts, I doubt you ever buy nVidia regardless), and neither was anyone else. The products were reviewed and compared to those of competitors. If performance was not acceptable, buy something else. If it was acceptable, why complain? Fact is, nVidia could get away with re-branding older architectures for the following simple reasons: 1. Improvements in process technology and design optimization led to measurable performance gains even on respun older architectures. 2. ATi was not offering anything that performed better.

I don't know how a crippled, limping, lackluster AMD/ATi managed to beat nVidia out of the gate this time. But by March or April of next year, no one will care, if Fermi is out and it rocks. ATi hasn't won a third round just by being first to market. Either nVidia will have to be VERY, VERY late, or Fermi will have to stink. A student of history would not take that bet.

Now that's fanboy rage.
 
Console ports are bad enough already, thanks. No need to make it easier for companies to make shitty console ports on the PC.

I agree, OpenGL equals watered-down games for the PC. I'm not surprised that I've been buying fewer games, because most of them are the same trash from year to year. Only a few games stand out each year. There's no need for some of the console devs to push the envelope; some of them are comfortable with the current market of making money for mediocre work. If the same game is going to be ported across all platforms, what is the need for PS3, 360, and PC platforms? To make money, that's all. No version is better than the other. It's all the same crap. Consumers would be better off having one platform; that's what the devs really want. DX11 will hopefully motivate more PC devs to make games exclusively for the PC again.

Whether you're buying a PS3 or 360, you're getting the same game. With DX11, PC games will look better and the experience will be more immersive than an OpenGL port. Devs are getting too lazy. The market for exclusive games is shrinking because of this. What was once a great game for one platform is now an average port for all platforms.
 
I agree, OpenGL equals watered-down games for the PC.

Nonsense. You think that any multi-platform game is first developed for PC? They start with the lowest common denominator (like the XBox 360) and then port it to others (PS3, PC). Virtually all of the games which got this treatment and also ended up running DX on Windows are watered down compared to what they could have been on a modern PC.

My company develops OpenGL games for Windows at this point. Once we start developing for consoles, we will develop on PC first, as it makes sense, then port to the consoles. It's easier to scale down game assets than it is to upgrade them for a higher-end system (the XBox 360 is a pile of 2006 hardware, after all).

OpenGL is and has always been on par with or ahead of D3D. D3D is the one that picks the features it likes from OpenGL (or rather, MSFT does). D3D is the slow-to-change API, whereas OpenGL, thanks to its extension mechanism, already had hardware tessellation years before DX11. There's no chance of such a thing happening with DX11.

OpenGL matters more than D3D.
 
Why aren't the devs pushing for OpenGL if it's better? Are they being stubborn with DX?

How does OpenGL compare to DX11 when it comes to tessellation?
 
Why aren't the devs pushing for OpenGL if it's better? Are they being stubborn with DX?
I guess it's mostly an artifact from when DX truly was a collection of APIs. These days the only useful API in DX is D3D. DirectInput and others got stripped out and moved to the Windows Platform SDK or deprecated. The networking API in DX (DirectPlay) isn't used often either, and DirectSound got mutilated in Vista and up (no 3D effects, no hardware acceleration).

OpenGL plus OpenAL and whatever network library fits your purposes should do more than fine these days. It's virtually the same as, or better than, developing using DX. Dev studios just have to migrate to new tools, and I guess those existing assets and toolchains are what's keeping most of them from switching.
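For anyone curious what the DX-free stack looks like in code, here's a minimal sketch of bringing up OpenAL with one positional source. It's an illustration only; device selection, error checking, and buffer loading are trimmed down, and it assumes you have the OpenAL (or openal-soft) headers and library installed:

```cpp
// Minimal OpenAL setup: roughly what DirectSound's 3D audio used to
// provide, but cross-platform.
#include <AL/al.h>
#include <AL/alc.h>
#include <cstdio>

int main() {
    // Open the default audio device and make a context current on it.
    ALCdevice* device = alcOpenDevice(nullptr);
    if (!device) { std::fprintf(stderr, "no audio device\n"); return 1; }
    ALCcontext* context = alcCreateContext(device, nullptr);
    alcMakeContextCurrent(context);

    // One source positioned in 3D space; the listener sits at the
    // origin by default, so this plays from the right, slightly ahead.
    ALuint source = 0;
    alGenSources(1, &source);
    alSource3f(source, AL_POSITION, 2.0f, 0.0f, -1.0f);

    // Loading audio data (alGenBuffers / alBufferData, then
    // alSourcei(source, AL_BUFFER, buf) and alSourcePlay) is omitted.

    alDeleteSources(1, &source);
    alcMakeContextCurrent(nullptr);
    alcDestroyContext(context);
    alcCloseDevice(device);
    return 0;
}
```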

How does OpenGL compare to DX11 when it comes to tessellation?
OpenGL has had hardware tessellation on AMD cards since about 2006/2007 via an extension. I thought nVidia had such an extension as well, but I'm not sure. It's the same kind of tessellation as DX11 has. Software tessellation has been possible using both APIs for a while now. I'd gladly see hardware tessellation make it into the OpenGL ARB spec, since right now a game engine has to allow for both the AMD and nVidia extensions.

Then again, adding tessellation to object models adds a significant workload, so I doubt we'll see it used very often.
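To show what I mean about engines having to probe for vendor extensions, here's a rough sketch of the kind of runtime check involved. I'm using AMD's GL_AMD_vertex_shader_tessellator as the example name; whether a given driver actually exposes it is something you'd have to verify yourself:

```cpp
// Sketch: detect a vendor tessellation extension at runtime.
// Classic GL 2.x-style query; assumes a GL context is already current
// and the platform's GL headers are set up.
#include <cstring>
#include <GL/gl.h>

// True if 'name' appears in the driver's extension string. A strict
// parser would tokenize on spaces to avoid prefix matches, but this
// is close enough for a sketch.
static bool hasExtension(const char* name) {
    const char* exts =
        reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return exts && std::strstr(exts, name) != nullptr;
}

void pickTessellationPath() {
    if (hasExtension("GL_AMD_vertex_shader_tessellator")) {
        // Enable the AMD hardware tessellation path.
    } else {
        // Fall back to software tessellation or pre-tessellated assets.
    }
}
```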
 
Elledan said:
I thought nVidia had such an extension as well, but I'm not sure. It's the same kind of tessellation as DX11 has.

Your last two posts have too many factual errors to count, but these take the cake (the tessellation hardware on R6x0/R7x0 is less programmable than the tessellation defined in the D3D11 specification; in addition, Nvidia obviously does not have an OpenGL extension for tessellation).

Elledan said:
D3D is the one that picks the features it likes from OpenGL

That one also gave me a good laugh (there are so many counterexamples to this that I don't even know where to begin).
 
It just doesn't matter. Crysis is multi-platform now so they can do CoD numbers. Valve still uses Quake 2/3-era tech; so does Blizzard. Developers are in it to make money, and there is no money in producing a full-fledged, top-to-bottom DirectX 11 game; that died with Crysis.

Consoles are the be-all and end-all of game development right now, and they don't use DirectX 11. It's not ten years ago or five years ago.

Until people stop pirating the sh#$ out of games like Crysis, maybe then it might matter that ATI has DirectX 11; otherwise it signifies nothing.
 
"We're almost there"

D'oh!

Translation: We blew our wad and we're still stuck in the refractory period.

I haven't seen much arguing. The fact is: if nVidia had produced the first DX11 video card, I am sure that I, as well as many others, would agree that nVidia would be bragging all about it. And I don't see why nVidia wouldn't.

For me, it's not about nVidia cards being better or worse; it's about the crap they've been spewing out lately.

This past year alone, what we've heard from nVidia, plus the fiasco regarding the defective G8X cores, was why I went ATI for my new build, not to mention...

The only thing they need now is a set of bad drivers that fry people's hardware.

...the issues I had with another build. Everything worked fine with the GeForce 8400 GS. Had dual monitor working, though video acceleration was a little choppy/laggy. Updated the drivers... and lost dual monitor. Completely. (Appears to be quite a common issue, too.) It was clone or blank. Had to nuke the drivers and stick with the old versions it came with. The card also runs hot as hell.

nVidia isn't going anywhere for a while, but they do seem to want to hasten their demise. The last time I went ATI was the Rage II Pro. Bad drivers, lackluster offerings, and lousy bang for the buck kept me away all these years. Got a 4670 for cheap. Couldn't have been happier.
 
The thing with OpenGL is that it's going to make games multi-platform... and if most games are available on most platforms, I'll be able to use Ubuntu for daily use and won't have to switch to Windows for gaming, and that's not really good for Microsoft.
 
Oh, this is ridiculous. ATI can't do DX10 properly yet... compare an ATi and an Nvidia GPU with the same DX9 performance, and the Nvidia GPU slaughters the ATi in DX10...
 
Oh, this is ridiculous. ATI can't do DX10 properly yet... compare an ATi and an Nvidia GPU with the same DX9 performance, and the Nvidia GPU slaughters the ATi in DX10...

Sure, if you can call a 3% lead a slaughter.
 
I meant we need to move away from the past APIs we're stuck with because of XP's popularity. XP sucked everybody in (developers, businesses, users, schools) because it was allowed to stay around for much too long.
I for one don't really see the continued use of SM3.0/SM4.0 as a particular issue. We don't need DX11 to keep improving upon the overall visual fidelity the PC platform offers, though it's advantageous in several respects. Obviously, for us, tessellation is the Big New Thing, but we're also seeing fairly severe performance penalties in its use (see DiRT 2), so we can expect its adoption in games to be fairly slow.

What I'm getting at is that the lack of DX11 adoption isn't a hindrance to increasing visual fidelity, thus NVIDIA's DX11 part being late to the game is pretty inconsequential in the grand scheme of things. At this point, AMD is getting out there and getting developers on board with DX11, so there's a fairly good degree of momentum there already, and the fact that we already have DX11 games on shelves is something that's genuinely surprising to me given how little time Windows 7 has been out.

I'm not upgrading my DX10 card until I see what GF100 brings to the table, but I'm really not missing out on much (next to nothing at this point), and I don't think the landscape would be any different if NVIDIA were to simply rush out their DX11 part. It's just not time to freak out about this yet.
 
I'm not upgrading my DX10 card until I see what GF100 brings to the table, but I'm really not missing out on much (next to nothing at this point), and I don't think the landscape would be any different if NVIDIA were to simply rush out their DX11 part. It's just not time to freak out about this yet.

Especially considering that, for example, the XBox 360 doesn't support hardware tessellation, DirectCompute, and other such DX11 features. Since most games aren't PC-only titles, a developer would be crazy to splurge so many resources on something which only 10% of the audience will ever see and appreciate. It won't make sense until the next series of consoles is out with upgraded GPUs.
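For what it's worth, on the PC side this is exactly what D3D11's feature levels are for: one runtime, and the game asks at startup what the GPU can actually do, then scales accordingly. A minimal sketch using the standard D3D11CreateDevice call, with error handling trimmed:

```cpp
// Sketch: create a D3D11 device and check whether the GPU has the
// full DX11 feature set (hardware tessellation, SM5 compute) or is a
// DX10-class part running on the same runtime.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL level = D3D_FEATURE_LEVEL_9_1;

    // Passing no explicit feature-level list asks the runtime for the
    // best level the hardware offers, from 11_0 downward.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION, &device, &level, nullptr);
    if (FAILED(hr)) return 1;

    // Only feature level 11_0 guarantees the hull/domain shader stages
    // and cs_5_0 compute; a 10.x-level card takes the scaled-down path.
    const bool fullDX11 = (level >= D3D_FEATURE_LEVEL_11_0);

    device->Release();
    return fullDX11 ? 0 : 2;
}
```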
 
That's great for you, but for those DX11 games you want, the hardware has to come first; it has to get out to game developers. The sooner that happens, the sooner you'll see DX11 games that will entice you to upgrade. Read my above post.

Actually, I'm still waiting for something significant on the DX10 front, but yeah...

The features that DX10 has shown in the current batch of games have been rather disappointing. DX9 still looks fantastic in comparison, but I know a lot of DX10 fans wear beer goggles.
 
Bah, some of you guys are content with the original Tomb Raider's introduction of bouncing boobs, but I want to see graphical advancements, even if it isn't much.

Every little addition counts.
 
Especially considering that, for example, the XBox 360 doesn't support hardware tessellation,

OK, this needs to stop. Can you make one post without an inaccuracy?
 