AMD/ATI Tempts Game Developers with DX10.1

Tamlin_WSGF

I'm not here to toot AMD/ATI's horn, but I welcome DX10.1.

It looks like more DX10.1 games are coming besides Diablo 3 and the rest from Blizzard, Sega and EA. Valve might be thrown into the mix as well. Perhaps we'll even get DX10.1 support for older titles? :cool:


An article from July 1st at TechPowerUp:
It is learned that AMD is looking to team up with developers for implementation of the DirectX 10.1 features. An 'old friend' of ATI, Valve could just have DirectX 10.1 implemented with ATI apparently making sure that happens. The next major revision of the Source engine that drives upcoming game titles such as Half Life 2: Episode 3, Portal 2 and Left 4 Dead could just be DirectX 10.1 titles looking at the flexibility of the Source engine and the ease with which new technologies could be incorporated into it.

Game developers have a tendency to play it safe though, and whether there will be any exclusive effects or not remains to be seen. There is no reason why they shouldn't implement DirectX 10.1 though; the worst-case scenario is that people with compliant hardware will get a performance boost where DX10.1 makes a difference over DX10.0. On the surface, DirectX 10.1 is touted to be more of a feature upgrade than a performance one.

http://www.techpowerup.com/64456/AMD_ATI_Tempts_Game_Developers_with_DirectX_10.1.html?64456

Here's an introduction to DX10.1 (that's not an ATI PDF :D):
http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=103&Itemid=29

Coders on DX10.1:
What does DX10.1 add vs. DX10 that makes such a big difference? [deferred shading]
About the downplaying of DX10.1

ATI's newest PDF about DX10.1:
http://ati.amd.com/products/pdf/DirectX10.1WhitePaperv1.0FINAL.pdf
 
Get over it, DX10 on XP won't happen, ever, at least not done by MS.

Nice to see ATI could be back in the game, pushing the feature set of its new cards full force.
 
Get over it, DX10 on XP won't happen, ever, at least not done by MS.

Nice to see ATI could be back in the game, pushing the feature set of its new cards full force.


It's nice to see DX10.1 getting implemented, and I hope Nvidia will come to support it as well. :)

Only now, after DX10.1 arrived with SP1, is support coming. Assassin's Creed was first out, but it only made minor use of DX10.1 (AA). Microsoft's promises about what DX10 could deliver didn't hold. Not only is DX10 slower than DX9, but the features used didn't give enough for the performance impact to be worth it. With DX10.1, it should be worthwhile to use DX10.1 code compared to DX9. Part of this is also poor porting from DX9 to DX10, which hurt performance even further.

From what I've read, it shouldn't be too hard to update current DX10 games with DX10.1 features. AA is supposedly not only faster, but also better. Global illumination (which was demonstrated in ATI's ping-pong ball demo) gives more of a 3D feeling as well. Not as good as ray tracing, but closer than before.

DX10.1 is optimized, and my hope is that they implement it in all the games on the market that currently support DX10. That should give a free performance increase for many users in current games, I think. :D
 
If anyone can help AMD/ATI right now, it is Valve.


Valve has always been there for ATI, so it's not surprising if they are updating their old titles and bringing DX10.1 to their new ones.

Assassin's Creed, as mentioned above, is an Nvidia "The Way It's Meant to Be Played" title, but before the removal of DX10.1, it didn't play as well on Nvidia cards due to the DX10.1 features. It wasn't only faster AA, but also better-looking AA, if Rage3D is to be believed:

The impact that DX10.1 has is quite frankly greater than expected. A 20% improvement in AA performance is very impressive. GPU refreshes have been built around lesser gains. Based on the game's behavior it's possible that Ubi are doing their own Custom Resolve in order to get HDR correct AA, and are also taking advantage of the full access to MSAA buffers DX10.1 brings in order to get both better quality and better performance, but without an official statement we'd rather refrain from speculating further.
http://www.rage3d.com/articles/assassinscreed/index.php?p=5

Faster and better looking than DX10. Of course we want DX10.1!
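The Rage3D quote mentions a "Custom Resolve" to get HDR-correct AA. Here's a toy sketch (purely illustrative, not Ubisoft's actual code; the sample values are made up) of why the full access to MSAA sample buffers that DX10.1 exposes matters: a fixed-function resolve averages samples in linear HDR before tone mapping, while a custom resolve can tone-map each sample first and then average, which gives the correct edge gradient.

```python
# Hypothetical sketch: custom MSAA resolve for HDR-correct AA.

def tonemap(x):
    """Simple Reinhard-style operator mapping HDR values toward [0, 1)."""
    return x / (1.0 + x)

# Four MSAA samples on a silhouette edge: three dim, one very bright HDR sample.
samples = [0.1, 0.1, 0.1, 16.0]

# Fixed resolve: average in linear HDR, THEN tone-map.
# The one bright sample dominates and the edge pixel stays near-white.
fixed = tonemap(sum(samples) / len(samples))

# Custom resolve (needs per-sample access): tone-map EACH sample, then average.
# The edge pixel blends properly between the dim and bright surfaces.
custom = sum(tonemap(s) for s in samples) / len(samples)

print(round(fixed, 3), round(custom, 3))  # 0.803 vs 0.303
```

The two results differ a lot for the same samples, which is why per-sample buffer access can buy both quality and performance (no extra render pass needed for the custom resolve).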

If only Nvidia would support it too, so that corporate money politics wouldn't spoil our chances of getting it in all titles! :mad:

Edit:
Following the link from the OP:

If AMD has played its cards right, Blizzard is not only looking to implement DX10.1 support, and in best-AMD-case scenario, DX10.1-exclusive effects, with future games, but also patch current games with DX10.1 for additional performance and effects. If everything comes together for AMD, this means that StarCraft 2, Diablo 3, future World of WarCraft add-ons, and more games from Blizzard will sport DX10.1, something which NVIDIA is unable to handle hardware-wise.
http://www.nordichardware.com/news,7904.html

Let's hope more will follow!
 

So these guys are saying that the next-gen nVidia GPU will support DX10.1 AND be 1.5 to 2 times faster than the gen just released. That's interesting. If this is true, the performance crown will go from the 4870X2 to the next nVidia part only three months after the 4870X2.

Well guys, you wanted competition; looks like we might be getting it. I guess it's good AMD's cards are cheap, because this gen from both sides might well be junk by Turkey Day.
 
It's expected that a new-gen GPU gets at least 1.5 and hopefully 2 times the power of the previous gen. But I don't think it will make the current gen obsolete even so. If that were the case, then the 9800GTX should be obsolete already, which it's not.

The truly good news is that Nvidia cards will get DX10.1 support, which wasn't so much expected as hoped for. :)
 
I'd say Q1 = March for most companies. So six months for the 4870X2.

Then nine months from March it's obsolete. Most likely Windows will be pushing DX11 by then.
 
Wow, I can only hope ATI comes out with something good as well, if Nvidia's really is that good ;) :p
 
The 4870X2 by itself is going to be 80% faster than the 4870; besides, new Radeons will be ready by that time.
 
I'd say Q1 = March for most companies. So six months for the 4870X2.

Then nine months from March it's obsolete. Most likely Windows will be pushing DX11 by then.

DX11 is under development, but I think it's going to take until at least 2009 before a finished SDK is ready for developers. DX11 hardware might take a while as well, since they need an SDK to test it against. Looking forward to DX11 too though! :D

It depends what will happen to CUDA then:
According to some insiders, Compute Shader technology in DirectX11 will possibly terminate the future of NVIDIA's CUDA technology.
http://www.pczilla.net/en/post/36.html

Besides that, what's coming up in games now is DX10.1, and it will be a joy when those titles arrive!

Sneak peek of DX11 on August 26. :)
 
This says it all straight from the horse's mouth:
Microsoft's senior global director of Microsoft games on Windows, Kevin Unangst, replied, "DX10.1 is an incremental update that won’t affect any games or gamers in the near future."
http://forum.beyond3d.com/showpost.php?p=1092612&postcount=5

I have 2 DX10.1 cards (HD3870, HD4850), but hyping it up isn't going to make a big difference in what developers choose to support. If they need it, and want to budget time and money for it, they will implement it. It's not a big deal otherwise.

It depends what will happen to CUDA then:

http://www.pczilla.net/en/post/36.html
That's just dumb. It won't "terminate" CUDA because CUDA will still run and support the base of legacy code. But if MS adds in a GPGPU API, it may eventually siphon off support from other GPGPU APIs, not just CUDA. If MS does a good job with it (not something that is guaranteed... see first iterations of almost anything MS does), of course it could be a welcome addition.
 
Of course DX10.1 is an incremental update. It's not DX11. If you remember when MS came out with this statement, it was because there was a lot of bad publicity about DX10.1 needing hardware to support it. It's not a software-only update; it requires a GPU that supports it. Since DX10.1 wasn't even out yet, it wouldn't affect anyone. DX10.1 was part of Vista SP1, and here is more from the same horse's mouth (from this year, not last year):

Adds support for Direct3D® 10.1, an update to Direct3D 10 that extends the API to support new hardware features, enabling 3D application and game developers to make more complete and efficient use of the upcoming generations of graphics hardware
http://forums.vr-zone.com/showthread.php?t=232519

I have 2 DX10.1 cards (HD3870, HD4850), but hyping it up isn't going to make a big difference in what developers choose to support. If they need it, and want to budget time and money for it, they will implement it. It's not a big deal otherwise.

Actually, it is a big deal. DX10 has been too inefficient and costly compared to its graphical benefits vs. DX9. DX10.1 brings more goodies to the table that make it more worth it. If you read my OP, I've given out enough information for most to see that DX10.1 is an update that would benefit gamers if implemented. And yes, if people are more aware of the benefits, it will be more interesting for developers to implement it as well.

Hyping it up helps gamers much more than downplaying it, which only helps GFX manufacturers that don't have it implemented in their hardware yet. I don't care that much for GFX manufacturers, so I prefer the first option; you prefer the second one, I guess. DX10.1 IS, without any doubt, more efficient and gives more eye candy than DX10 when run on hardware that supports both.


That's just dumb. It won't "terminate" CUDA because CUDA will still run and support the base of legacy code. But if MS adds in a GPGPU API, it may eventually siphon off support from other GPGPU APIs, not just CUDA. If MS does a good job with it (not something that is guaranteed... see first iterations of almost anything MS does), of course it could be a welcome addition.

I wouldn't dismiss it too fast; THAT would be dumb. If the Compute Shader technology in DirectX 11 bypasses the API as they talked about, then the CUDA API might not be compatible enough with DX11. Remains to be seen though.
 
So these guys are saying that the next-gen nVidia GPU will support DX10.1 AND be 1.5 to 2 times faster than the gen just released. That's interesting. If this is true, the performance crown will go from the 4870X2 to the next nVidia part only three months after the 4870X2.

Well guys, you wanted competition; looks like we might be getting it. I guess it's good AMD's cards are cheap, because this gen from both sides might well be junk by Turkey Day.

ROFL. Sour much?

Remember when the GT200 was hyped up to the sky with 240 SPs, a smegload of bandwidth and "1.5x shader improvements"? Aka as fast as 3 Ultras in SLI? And even more, because the 9600GT that had ONLY 64 SPs was so good?


Hah.hah.ha.
 
Now all we need is DX10 on XP and the circle will be complete.

Not going to happen, ever.

I wasn't that excited about DX10.1 to start with, since it offers little in the way of feature updates for users and is mostly a development improvement. But if that translates into the kind of performance difference we saw in Assassin's Creed when used correctly, that's great.

I'm glad I went AMD this round; DX10.1 support may come in handy soon, and Nvidia is going to be left out to dry. This is coming from an Nvidia fanboy :p
 
There are some features I REALLY want to see in DX10.1 games, especially global illumination:

The main advantages of DirectX 10.1 include global illumination delivering lighting and shadow quality in real-time that matches the ray tracing techniques used in CG films, improved anti-aliasing techniques to clean up distracting shimmering artifacts, and tighter specifications for improved compatibility.
http://www.legitreviews.com/article/584/1/

Here's an AVI/DivX file showing global illumination in action (about 7 MB):


http://graphics.ucsd.edu/~henrik/animations/jensen-the_light_of_mies_small.avi
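The "3D feeling" global illumination adds comes from indirect light: surfaces lit by light bounced off other surfaces, not just by the light source directly. A toy Monte Carlo sketch of one bounce (the scene, albedos and attenuation model here are made up for illustration; real GI in games or films is far more involved) shows the classic "color bleeding" effect, where a grey floor next to a red wall picks up a red tint that direct lighting alone can never produce:

```python
import random

# Toy one-bounce global illumination ("color bleeding") estimate.
random.seed(42)

LIGHT = (1.0, 1.0, 1.0)        # white light intensity
WALL_ALBEDO = (0.8, 0.1, 0.1)  # red wall next to the floor point
FLOOR_ALBEDO = (0.5, 0.5, 0.5) # grey floor

def scale(c, k):
    return tuple(x * k for x in c)

def mul(a, b):
    return tuple(x * y for x, y in zip(a, b))

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

# Direct term: light hits the floor point straight on.
direct = mul(FLOOR_ALBEDO, LIGHT)

# Indirect term: light reflected off the wall toward the floor, estimated
# with random samples (a stand-in for proper hemisphere sampling).
N = 1000
indirect = (0.0, 0.0, 0.0)
for _ in range(N):
    w = random.random()  # random visibility/attenuation factor per sample
    bounce = scale(mul(WALL_ALBEDO, LIGHT), w)  # light after one wall bounce
    indirect = add(indirect, mul(FLOOR_ALBEDO, bounce))
indirect = scale(indirect, 1.0 / N)

total = add(direct, indirect)
# The red channel of `total` now exceeds green/blue even though both the
# light and the floor are neutral: that red tint is the indirect bounce.
print(tuple(round(x, 3) for x in total))
```

Direct lighting alone gives a perfectly neutral grey here; only the bounced term carries the wall's color onto the floor, which is exactly the kind of subtle cue that makes GI-lit scenes read as more three-dimensional.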
 
Great, ATI is finally back. NVIDIA should have implemented DirectX 10.1 in their GT200 cards; God knows they didn't. Guess they finally opened their eyes.
 
Great, ATI is finally back. NVIDIA should have implemented DirectX 10.1 in their GT200 cards; God knows they didn't. Guess they finally opened their eyes.

I had hoped they would implement it in the GT200 (I was actually waiting for that card ever since the G100 chip rumors).

DX10.1 games are coming, and regardless of previous attempts by some companies to downplay it, nobody has denied that it's an improvement over DX10. I think everyone agrees on that! :D
 
Actually, it is a big deal. DX10 has been too inefficient and costly compared to its graphical benefits vs. DX9. DX10.1 brings more goodies to the table that make it more worth it.
You ignore how long it takes to implement full support for new features, and the extent to which developers support subsets of features (they generally don't... Crytek and Valve are exceptions). By the time there is any meaningful support for DX10.1 features, it will be available in virtually all cards. Early availability of a feature doesn't guarantee it will be used widely soon after release. If AMD builds developer relations as strong as nvidia's, that might change the situation a bit.

You don't seem to understand what CUDA is or how it works. Hint: it compiles to SM4 shaders and the driver includes a run-time for the APIs. It has little to do with the DX API.
 
You ignore how long it takes to implement full support for new features, and the extent to which developers support subsets of features (they generally don't... Crytek and Valve are exceptions). By the time there is any meaningful support for DX10.1 features, it will be available in virtually all cards. Early availability of a feature doesn't guarantee it will be used widely soon after release. If AMD builds developer relations as strong as nvidia's, that might change the situation a bit.

In your eagerness to downplay DX10.1, I think you are complicating things more than is realistic. As we discussed earlier in this thread, DX10.1 is an "incremental update". DX10.1 doesn't need a full rewrite of the game code or the engine code. This is also why the articles I've linked to earlier discuss implementing DX10.1 even in existing games.
Also, considering that Nvidia is finally coming with DX10.1 cards, there will be even more reason for game developers to implement support for it.


You don't seem to understand what CUDA is or how it works. Hint: it compiles to SM4 shaders and the driver includes a run-time for the APIs. It has little to do with the DX API.

No need to become patronizing. Again:

According to some insiders, Compute Shader technology in DirectX 11 will possibly terminate the future of NVIDIA's CUDA technology
http://www.pczilla.net/en/post/36.html

Edit: About compute shaders:

The Direct3D API imposes some constraints on the processing model in order to achieve optimal rendering performance. Direct3D 11 introduces the Compute Shader as a way to access this computational capability without so many constraints. It opens the door to operations on more general data-structures than just arrays, and to new classes of algorithms as well. Key features include: communication of data between threads, and a rich set of primitives for random access and streaming I/O operations. These features enable faster and simpler implementations of techniques already in use, such as imaging and post-processing effects, and also open up new techniques that become feasible on Direct3D 11–class hardware.

http://www.xnagamefest.com/conference_details08.htm
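The "communication of data between threads" that quote highlights is what sets a compute shader apart from a pixel shader, where each pixel runs in isolation. A Python stand-in (threads and a barrier playing the role of a thread group and its sync intrinsic, such as HLSL's GroupMemoryBarrierWithGroupSync; the group size and values are made up) can sketch the canonical example, a parallel tree reduction over group-shared memory:

```python
import threading

# Toy model of a compute-shader thread group doing a parallel reduction:
# each "thread" owns one slot of group-shared memory, and a barrier
# synchronizes the group between reduction steps.

GROUP_SIZE = 8
shared = [1, 2, 3, 4, 5, 6, 7, 8]  # per-thread values in "shared memory"
barrier = threading.Barrier(GROUP_SIZE)

def thread_main(tid):
    stride = GROUP_SIZE // 2
    while stride > 0:
        barrier.wait()            # wait for the previous step's writes
        if tid < stride:          # lower half adds in the upper half
            shared[tid] += shared[tid + stride]
        stride //= 2
    barrier.wait()                # final sync before reading the result

threads = [threading.Thread(target=thread_main, args=(i,)) for i in range(GROUP_SIZE)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(shared[0])  # 36: the whole group's sum, reduced in log2(8) = 3 steps
```

Nothing like this cross-thread cooperation is expressible in the regular rendering pipeline, which is why a Compute Shader in the DX API itself would overlap so directly with what CUDA offers today.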

Perhaps game developers will sacrifice this and use CUDA instead, but it's bloody unlikely.

Since DX11 hasn't been introduced yet and is only at the speculation stage, I wouldn't dismiss the possibility that CUDA ends up without an advantage. It's not necessarily a matter of rendering CUDA useless, but if DX11 supports alternative methods with higher efficiency and broader support, CUDA might not even be interesting for gamers and developers.

Speaking of DX11 ...

According to the new features advertised for DX11, most of them are already supported in DX10.1 (but not in DX10). Maybe not to the extent of full DX11 hardware support, but it's not unlikely that a game will support both, and perhaps even offer software support for DX11 features on DX10.1 cards. Speculation only, since it's not released yet.

When DX11 will be implemented depends, since DX10.1 is currently supported by S3 and ATI, and by Nvidia in Q1 2009. Games with DX10.1 support are coming now, as mentioned earlier, so downplay DX10.1 as much as you wish. It will be supported, and it is the near future in games. I'm looking forward to it!
 