TaintedSquirrel
It's not the type of game; all games can use async compute for performance benefits.
Lighting and post-process algorithms that use compute shaders to do their calculations are what's being used now, but physics can be done with compute as well.
If you look at VXGI, the GI system uses heavy compute, I would say on the order of 500+ instructions... (guessing, haven't looked too closely at it, not my job to do that).
POM with self-shadowing will use it too, at around 350 instructions. Flex uses it. It's definitely something that will be used in future games, but to what degree is the main issue for Maxwell 2 based cards. I don't think you'll see any game that really hurts Maxwell 2 in the near future if the developers aren't dumb; 82% of the market is not a small number to ignore.
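The point above is that async compute lets independent work (lighting, physics, post-processing) overlap with graphics work instead of running after it. As a rough CPU-side analogy only, not actual GPU code, here is a minimal Python sketch where two stand-in workloads (names are illustrative, timings are fake) run serially versus overlapped on two "queues":

```python
import concurrent.futures
import time

def graphics_pass():
    # stand-in for the graphics queue: pretend to render for 50 ms
    time.sleep(0.05)
    return "rendered"

def compute_pass():
    # stand-in for an async compute queue: lighting/physics work, 50 ms
    time.sleep(0.05)
    return "computed"

def serial_frame():
    # no async compute: graphics finishes before compute even starts
    t0 = time.perf_counter()
    graphics_pass()
    compute_pass()
    return time.perf_counter() - t0

def overlapped_frame():
    # "async compute": submit both workloads at once, as a GPU with
    # independent graphics and compute queues would
    t0 = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
        g = pool.submit(graphics_pass)
        c = pool.submit(compute_pass)
        g.result()
        c.result()
    return time.perf_counter() - t0

print(f"serial: {serial_frame():.3f}s, overlapped: {overlapped_frame():.3f}s")
```

The overlapped version finishes in roughly the time of the longer workload rather than the sum of both, which is the whole appeal: the "free" performance only appears when the two workloads don't contend for the same resources, which is why architecture matters here.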
I predict that once more DX12 games start coming out nVidia is going to totally shit-stomp AMD like they always have and it won't even matter that AMD supports async compute and nVidia doesn't.
Lol, developers develop first for consoles, then port to PC... I bet the next Xbox and PS4 games will be taking advantage of async.
Well, it's not really holding back the industry; it's all about timing. nV has always seemed to make architectures that fit the timing of developer needs. With the Xbox One, something was missing from the SDK until DX12 arrived, so they didn't need to worry about that. The PS4 does use async code extensively, so with games being ported from it to PC you might see some issues, but again, porting takes time. And then we have Vulkan, which took a while to get released, so nV is safe on that side too. Vulkan is small in the gaming market, though hopefully it will grow and stem DX's hold on the market.
AMD seems to have better capabilities in that regard above a certain amount of workload for async shaders, but games that need that workload aren't going to come out yet, because the timing was just a little off.
This happened with the X1800 series: it had better ROP throughput than the 7800 series, and the X1900 series had better ALU throughput than the 7900 series, but games that needed that kind of ALU and ROP throughput didn't show up until the 2900 and 8800 series of cards.
I suggest you read up on the Xbox One; that's a no. PS4, yeah.
§kynet;1041834689 said:
It's funny we've been having this debate back and forth for days now and this is the first time I've seen that link. It seems incredibly relevant; surprised it took this long to pop up.
I actually put that link up, as well as several other titles which will use Async Compute, over at Overclockers.net:
Mirror's Edge: http://gearnuke.com/mirrors-edge-catalyst-reach-new-levels-gpu-optimizations-via-async-compute/
Deus Ex: http://gearnuke.com/deus-ex-mankind-divided-use-async-compute-enhance-pure-hair-simulation/
Rise of Tomb Raider
Fable Legends
And a whole slew of others...
E3 2015 had the Fable Legends dev talking about Async Compute... gushing over it really (18mins in): https://www.youtube.com/watch?v=7MEgJLvoP2U
I know I'm not the most popular guy around these days but there's a reason I made such a big deal out of it. I even have Johan Andersson gushing over it (but then again he's been gushing about it for years).
I think only the Unreal4 engine is iffy on the topic. Tim Sweeney says he likes it but that developers need to use it with caution (though he did advocate its use for Unreal4 on the Xbox One).
As for the performance one can expect? Maybe a larger boost than what we saw on Ashes of the Singularity if the Oxide developer is correct in his statement. But who knows really.
Aren't all of these games Q1 of next year for the PC, outside of Fable Legends?
Epic is not against async compute. Unreal 4 was made for indie developers and people who aren't professional game developers, as well as full-size professional studios; the warning you read in the Unreal Engine 4 guide is for the former, not the latter.
They're all for Q1 2016.. I didn't look beyond that.
At least I kinda figure I'll be able to play these games before Greenland and Pascal hit. I'm still on old Hawaii cards (290x Crossfire).
Fable Legends is the title I'm looking forward to the most. I get to play with my wife, she likes those games, and there's nothing more amusing than gaming with your wife (less yelling at ya to get off the PC).
Yeah, I am only looking forward to Deus Ex and Battlefront. Never was a fan of Fable or Mirror's Edge.
Does anyone know if Star Wars Battlefront will use Async?
Edit: Ok who gives a fuck about battlefront. Matchmaking on the PC?......yea my day just went to shit.
Fuck you EA!!
Back on topic.
Scott Wasson and David Kanter talk briefly about AOTS and async compute:
https://www.youtube.com/watch?v=tTVeZlwn9W8#t=1h18m55s
Like I've said elsewhere, god help anyone who bought Kepler or Maxwell for VR.
Given the severe VR issues, I'd think Nvidia would have to change course more for Pascal. I hope they did, anyway; otherwise this could be a disaster for them next year.
Lol, developers develop first for consoles, then port to PC... I bet the next Xbox and PS4 games will be taking advantage of async.
If I had a nickel for every time I've read "AMD is in the con-souls, so Pee Cee ports will be better 4 AMD cardz" in the past few years. Hasn't happened.
Now simples have latched onto a new buzzword they don't even understand, and it's off to the races again. Utter comedy.
We actually just chatted with Nvidia about Async Compute, indeed the driver hasn't fully implemented it yet, but it appeared like it was. We are working closely with them as they fully implement Async Compute. We'll keep everyone posted as we learn more.
New post from Oxide.
Probably deserves its own thread.
http://www.overclock.net/t/1569897/...ingularity-dx12-benchmarks/2130#post_24379702
I suppose the B3D thread was mostly accurate. Driver issue.
If that statement had been included with the previous post, this whole situation could have been avoided. Or Nvidia could have just let people know their async wasn't fully implemented when the original benchmarks went live two weeks ago! Smells like some damage control here, but we'll see.
Let's see what Hallock has to say about it; he might have to do some damage control on his end too.
I think I'm more upset with Hallock in this whole situation.
We have one post from Oxide, which clearly states they haven't worked through the issues yet, and a buggy benchmark at B3D which they even acknowledge was unreliable.
And Hallock takes to social media spouting nonsense as fact. If this gets resolved over the next few weeks, I hope the handful of extra sales AMD got will outweigh the damage control they have to do for Hallock. Even if he ends up being right, it's obvious he jumped the gun. Embarrassing and shameful.
They lie to consumers, misrepresent products, to boost their sales. I will never buy another product from AMD. /s