Stardock CEO: DX11 vs. DX12 = 13fps vs. 120fps

My buddy who works for Stardock pointed me to this: https://twitter.com/draginol/status/566360528264908800
He said he already saw what it was, and he said it's fuckin' crazy, but he can't tell me what it is cuz of NDA. I fucking hate teases. He said the original plan was to announce it yesterday, but now they are waiting for GDC.

Hmmmm, let's speculate.
They increase performance of DX12 then make a statement about something running in 4K.
 
Has Stardock ever made a game that actually needs high framerate? Or hell even pushed the boundaries of existing technology? Oh wait no they haven't.
 
Pie in the sky until actual games ship with both DX12 and DX11 paths and we see benchmarks.
 
Cool,
Then XBOX One can get DX 12 and run a million times faster than the PS4!

It doesn't need this. You know how unplugging the Kinect gained you 10% performance? Here's some next-level stuff: Disconnect the controllers as well and you'll gain even more! Then you might get usable framerates.

:D
 
Using an unspecified branch, I was able to throw it 920% farther in an unspecified environment under unspecified conditions.

IF I TWEET IT IT MUST BE TRUE GAIZ
 
That is quite an impressive increase in performance. How many games in development right now are going to use DX12 or can the games that are on the market now be updated to take advantage of DX12?
 
That is quite an impressive increase in performance. How many games in development right now are going to use DX12 or can the games that are on the market now be updated to take advantage of DX12?

Any game already released or close to release is highly unlikely to ever support DX12.
 
As has been already posted, Wardell is highly respected in the PC dev community. He doesn't spout bullshit.
 
That's quite a substantial increase in performance, but I'm kind of worried that it will only show up in specific scenarios. I'm excited, but I am skeptical. Either way, if DX12 really works that well, my hat's off to AMD for pushing MS to actually improve it as much as they did.

Please, journalists, start getting this right.

It already started at 100% performance, meaning it increased by 820% for a total of 920%. You say it increased by 920%, which is wrong.

Why do so many people suck at math even when they're technically-oriented?

Yes, a lot of people get this wrong. I usually don't bother pointing it out, as I see it as nitpicking. Just because people are technically oriented does not mean they are mathematically oriented. Also, it's not so much a question of "math orientation" as an issue with the way they state things. I mean, it's subtle: 125% faster vs. 125% as fast (N*1 + N*1.25 vs. N*1.25). It's really a language issue more than a math issue. The math involved is not complex.

There are also a lot of people that get something related wrong. I'm trying to remember exactly what I mean, but I think it's something along the lines of:

N*1.25=1.25N

Self explanatory, right?

However:
(1.25N)/1.25 != 0.75*(1.25N)

I see a lot of people make this kind of mistake. Basically, when something is 25% faster (this is relevant for GPU speeds, for instance), they automatically think that to find the speed of whatever it is faster than, they take 1-0.25. So if model 2 ran at 125 MHz, they mistake model 1 as running at 93.75 MHz. In reality it's 100 MHz (1/1.25=0.8, so 0.8*125=100). (In this scenario, N was 100, obviously.)
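The mistake can be sketched in a couple of lines of Python, using the hypothetical 125 MHz clocks from the example above (illustrative numbers only):

```python
# Model 2 is "25% faster" than model 1 (hypothetical clocks from the example).
model2_mhz = 125.0

# Wrong: subtracting 25% does not undo a 25% increase.
wrong_model1 = model2_mhz * (1 - 0.25)   # 93.75 MHz

# Right: divide by the ratio, i.e. multiply by 1/1.25 = 0.8.
right_model1 = model2_mhz / 1.25         # 100.0 MHz

print(wrong_model1, right_model1)   # 93.75 100.0
```

The asymmetry is just that the inverse of multiplying by 1.25 is dividing by 1.25 (i.e. multiplying by 0.8), not multiplying by 0.75.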

Ah I see your point now, they should have done (120-13)/13 * 100 ~ 820%.

What you're doing is equivalent to (120/13 - 1)*100, or 920-100. 120/13 is really the only necessary part as it is the ratio (the basis of your comparison), while +/-100% depends on how you state it in the sentence. One way includes the existing quantity of 13, while one doesn't.
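The fps numbers from the headline make the distinction concrete; note the thread rounds the exact values of ~923% and ~823% down to 920% and 820%:

```python
old_fps, new_fps = 13, 120

# "920% as fast": new performance as a percentage OF the old.
pct_of = new_fps / old_fps * 100                    # ~923.1

# "820% faster": the INCREASE relative to the old.
pct_increase = (new_fps - old_fps) / old_fps * 100  # ~823.1

print(round(pct_of, 1), round(pct_increase, 1))
```

The two differ by exactly 100 percentage points, which is the "+/-100% depends on how you state it" point made above.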
 
I guess the bottom line is that anytime people make claims like these they turn out to be Bullshit.
Remember Dean Kamen and "Ginger"?
It was going to change the world; cities would be redesigned to accommodate them.
I was like Ralphie after figuring out the Orphan Annie decoder message, "Son of a bitch, it's a scooter"
 
When they say "a 920% increase" they mean performance was 920% of original. This isn't innumeracy, it's a trip-up in language.
 
I guess the bottom line is that anytime people make claims like these they turn out to be Bullshit.
Remember Dean Kamen and "Ginger"?
It was going to change the world; cities would be redesigned to accommodate them.
I was like Ralphie after figuring out the Orphan Annie decoder message, "Son of a bitch, it's a scooter"
Right, but Wardell isn't a bullshitter though. Kamen is a massive bullshitter.

One must consider the speaker's history of bullshittery when determining whether bullshit is being spoken.
 
Ignoring the hype, it's good that some bottlenecks are finally being addressed in D3D. It's been long overdue.
 
When they say "a 920% increase" they mean performance was 920% of original. This isn't innumeracy, it's a trip-up in language.

It's downright incorrect. It's not trickery. It is just wrong. Whether it's intentional or not is unknown but it's literally incorrect. It's not just a different interpretation of the numbers or of the language. It is wrong. Wrong wrong wrong.
 
It's downright incorrect. It's not trickery. It is just wrong. Whether it's intentional or not is unknown but it's literally incorrect. It's not just a different interpretation of the numbers or of the language. It is wrong. Wrong wrong wrong.

Don't be an autist, please. :D
 
Any game already released or close to release is highly unlikely to ever support DX12.

This is true for a few reasons. #1, once a game ships, a developer is usually on to the next thing and there's little financial incentive to go back and rework an old game (exceptions being the current-gen consoles launching and there being financial reason to 'remaster' certain games, using the opportunity to buff the PC version at the same time), and #2, since DX12 excludes Windows 7, it's going to be years before all but the biggest developers stop building games around DX11, or hell, even DX9, which games are still shipping with alarming regularity.
 
That's a lot of fucking overhead in DX11 if those numbers are accurate.

Or, more likely, the drivers for this unreleased GPU have been totally optimized for DX12, and COMPLETELY IGNORED for DX11.

I imagine that by the time DX12 and the hardware in question are both in full release, the difference will be significantly less.
 
Or, more likely, the drivers for this unreleased GPU have been totally optimized for DX12, and COMPLETELY IGNORED for DX11.

I imagine that by the time DX12 and the hardware in question are both in full release, the difference will be significantly less.

On the bright side, at least something....... SOMETHING.. is finally happening to move PC graphics forward. DX11 only came out in 2009.... :rolleyes:

Of course, Khronos' glNext being unveiled by Valve next month at GDC is also going to be very interesting.
 
That's a lot of fucking overhead in AMD's DX11 drivers if those numbers are accurate.

FYP ;)

AMD's drivers do not support multithreading in DX11, unlike NVIDIA's... so those numbers could be very misleading.
 
Those Episode 1 graphics don't look so good.

Does Stardock even make graphics-intensive or photoreal-type games? The GalCiv series has always looked a decade or more out of date.

I'm not sure I trust this guy's opinion.

Galactic Civilizations is not a graphically intensive game. The closest Stardock has come is with the Sins of a Solar Empire series: technically they were the publisher, but reports are they had teams working closely with Ironclad Games, who developed it.

A space real-time strategy game where it's easy to see why they'd want to increase batch counts and lighting effects for explosions. The company is called STARdock; the guys must be interested in space themes.

https://www.youtube.com/watch?v=OEowIxd_VxY

That is exactly the type of game best suited to be enhanced by Mantle/DX12. Shooters, meh, not so much, except perhaps with more light sources.
 
Something struck me.
Why is DX12 now being used to hype AMD hardware...I thought Mantle was the "Shizzle"?
 
This is true for a few reasons. #1, once a game ships, a developer is usually on to the next thing and there's little financial incentive to go back and rework an old game (exceptions being the current-gen consoles launching and there being financial reason to 'remaster' certain games, using the opportunity to buff the PC version at the same time), and #2, since DX12 excludes Windows 7, it's going to be years before all but the biggest developers stop building games around DX11, or hell, even DX9, which games are still shipping with alarming regularity.

I believe that's the reason Microsoft is letting all current users upgrade to Windows 10 for free within the first year. This should help quicken the adoption rate.
 
Something struck me.
Why is DX12 now being used to hype AMD hardware...I thought Mantle was the "Shizzle"?

Hmmm, cuz DX12 is cross-platform, locked-down Mantle? Same reason it is being used to "hype" 8-core CPUs. Because AMD supplies both :)
Why else would they "hype" it? Hmmmm?

PS - For anyone poo-pooing DX12 and its lack of adoption: this isn't 2008. We all know the majority of game development is console. Console is GCN. GCN's low-level API is Mantle, or shades of it. MS wants to profit from a new OS, and DX12 for gamers is it.
Largest OS switch in history. And this time for a good reason: DX12!
 
Multithreaded drivers result in almost zero performance benefit; in fact, it's weird to see it even mentioned.

Also, both DX12 and Mantle prop up AMD CPUs. They both remove the CPU bottleneck, and neither software stack generates money for AMD. I'd BET AMD is happy to see DX12, as they wouldn't have to maintain Mantle anymore.
 
I'd BET AMD is happy to see DX12, as they wouldn't have to maintain Mantle anymore.

Happy or not, I don't know, but as far as the rest of this statement, I'll take that bet. DX12 won't work on existing versions of Windows and thus cannot fully replace Mantle. Also, if AMD pulls Mantle support this early in its life, they will never be trusted by a developer again. There are games that support it and developers went to the trouble to do so and the product should not be immediately discontinued.
 
It seems like MS and AMD are in bed together on Mantle/DX12 so that MS can get away from a hardware-specific platform and get all their content on multiple platforms. Desktops, laptops, and mobile devices all able to deliver the same content on AMD-powered devices?... That would leave Sony in a pickle; they are all about the hardware. MS would have more flexibility: instead of trying to sell a console, they can sell a multi-platform gaming/entertainment ecosystem. Like cellphones, the hardware could be free, but you have to sign a 2-4 year subscription.
 
Happy or not, I don't know, but as far as the rest of this statement, I'll take that bet. DX12 won't work on existing versions of Windows and thus cannot fully replace Mantle. Also, if AMD pulls Mantle support this early in its life, they will never be trusted by a developer again. There are games that support it and developers went to the trouble to do so and the product should not be immediately discontinued.

AMD already isn't trusted by a lot of developers. If they drop support for Mantle while it's still in the trial phase, I doubt any developer will care.
 
Multithreaded drivers result in almost zero performance benefit; in fact, it's weird to see it even mentioned.

Also, both DX12 and Mantle prop up AMD CPUs. They both remove the CPU bottleneck, and neither software stack generates money for AMD. I'd BET AMD is happy to see DX12, as they wouldn't have to maintain Mantle anymore.

It ups the FPS for NVIDIA cards in Star Swarm when it is on compared to off... hence it matters.
 
Multithreaded drivers result in almost zero performance benefit; in fact, it's weird to see it even mentioned.

It depends.

When DX11 introduced multithreading, and the drivers finally got around to supporting it, it made a HUGE difference in some titles and not very much of a difference in others.


At that time, I was playing Red Orchestra II a lot. It is a very CPU-intensive game because of all the bullet physics on large servers, and it was even worse back then, as the code has been heavily optimized since.

I was on an AMD Phenom II x6 1090T at the time, so the individual cores weren't the strongest, but it did have 6 cores, a lot for the time.

With this combination, as soon as multithreaded driver support was added, I saw a substantial performance improvement. In other titles that didn't tax the CPU as much, it had no real effect at all. I'd also imagine that on an Intel-based system with stronger per-core performance, the effect would have been smaller.
 
All DX11 multithreading did was spread the workload of wasted CPU resources across multiple cores/threads. It was a band-aid solution at best. It hid the symptoms, not the underlying issue.
 
It still gives NVIDIA better FPS in Star Swarm.
And you need to compare complete DX11 performance vs. DX12 to get the true picture.
Gimped DX11 gives a skewed view, simple as that.
 