Stardock CEO: DX11 vs. DX12 = 13fps vs. 120fps

Please, journalists, start getting this right.

It already started at 100% performance, meaning it increased by 820% for a total of 920%. You say it increased by 920%, which is wrong.

Why do so many people suck at math even when they're technically-oriented?
 
This is way too vague in my book. All he's stating is that there are lighting and lens effects. I assume, at least, that this was done at a decent resolution with AA and such; otherwise it's just a stupid statement to make.
 
I'm sure they'll release new GPUs that perform a lot better in DX12 than DX11 at some point in the future. But this kind of hyperbolic rhetoric, caused by analysis of a small set of features (which are likely to be the ones that perform the best), is not really the sort of thing that needs to be out in public.

DX12 is a revolutionary API, yes, but it's not magic.
 
Please, journalists, start getting this right.

It already started at 100% performance, meaning it increased by 820% for a total of 920%. You say it increased by 920%, which is wrong.

Why do so many people suck at math even when they're technically-oriented?

It says an n% increase.
A 100% performance increase means performance effectively doubled.
 
It says an n% increase.
A 100% performance increase means performance effectively doubled.

Right, but it increased by 820%; he included the original 100% in his claim, saying it increased 920%. It is 920% of the original performance, not a 920% increase. It increased by 820%, for a total of 920%.

Nearly every tech writer on any site I browse makes this mistake, and it gets old. I put a certain degree of trust in these writers to help me decide what products I buy, but when almost all of these people can't do basic math, it makes them hard to trust.
 
That's a lot of fucking overhead in DX11 if those numbers are accurate.
 
That's a lot of fucking overhead in DX11 if those numbers are accurate.

It's clearly a corner case, i.e. in the real world you won't see this sort of improvement. Still, as long as it doesn't have any major regressions in other cases, it's nice to have.
 
Right, but it increased by 820%; he included the original 100% in his claim, saying it increased 920%. It is 920% of the original performance, not a 920% increase. It increased by 820%, for a total of 920%.

Nearly every tech writer on any site I browse makes this mistake, and it gets old. I put a certain degree of trust in these writers to help me decide what products I buy, but when almost all of these people can't do basic math, it makes them hard to trust.

Ah, I see your point now; they should have done (120 − 13)/13 × 100 ≈ 823%.
Just went through some older articles on random tech sites, and you're right, the majority of them seem to have no clue about this most basic of math concepts.

Then again, they are "journalists"....
 
Please, journalists, start getting this right.

It already started at 100% performance, meaning it increased by 820% for a total of 920%. You say it increased by 920%, which is wrong.

Why do so many people suck at math even when they're technically-oriented?

A spreadsheet formula that I keep handy:

Percentage Increase/Decrease =((B1-A1)/A1)*100

Which yields an increase of about 823%, as you point out. Likewise, DX11 represents an 89% decrease compared to DX12.
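
If you'd rather script it than keep a spreadsheet handy, here's a minimal Python sketch of the same arithmetic, just plugging in the 13 fps and 120 fps figures from the headline (the helper name is mine, not from the article):

# Sanity check of the 13 fps (DX11) vs. 120 fps (DX12) headline figures.
def percent_change(old, new):
    """Percentage increase (positive) or decrease (negative) going from old to new."""
    return (new - old) / old * 100

dx11_fps, dx12_fps = 13, 120

print(f"Increase, DX11 -> DX12: {percent_change(dx11_fps, dx12_fps):.0f}%")  # ~823% increase
print(f"DX12 as a % of DX11:    {dx12_fps / dx11_fps * 100:.0f}%")           # ~923% of the original
print(f"Decrease, DX12 -> DX11: {percent_change(dx12_fps, dx11_fps):.0f}%")  # ~-89%, i.e. an 89% drop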
 
Stardock’s CEO, Brad Wardell, claims that he saw a 920% increase in performance between DirectX 11 and DirectX 12 on an unreleased GPU. :eek:

I claimed to see Bigfoot at an unspecified location and time while chugging large amounts of Tylenol with codeine... but I posted it on Twitter, so it must be true.

Screen shots or it didn't happen.
 
Brad Wardell is pretty well respected in the PC/Windows community. I don't think he's just making this up, though upon inspection I'm sure there's bound to be lots of debate as to what this is actually all about.
 
920% is weaksauce; the author shoulda said, "DX12 offers a 1,000,000% increase in performance using my magical GPU made from pure extract of leprechaun farts, take my word for it!"
 
920% is weaksauce; the author shoulda said, "DX12 offers a 1,000,000% increase in performance using my magical GPU made from pure extract of leprechaun farts, take my word for it!"

[reaction GIF]
 
I don't see what the issue is. It's probably a test case that happened to be inefficient on DX11, or maybe even done on the CPU, which is now much more efficient or moved to the GPU. Think of things like video processing moved from the CPU to the GPU using CUDA... many of those cases see a 20-50 times increase in speed (2000-5000% of the original). I very much doubt you will see a game suddenly go from 50fps using DX11 to 460fps in DX12.
 
Considering how CPU-bound DX11 is, this is likely on an AMD processor with a higher-end GPU. DX12 lessens the impact of the AMD processor's lower efficiency. This is especially likely considering he mentioned an "8 core CPU." The only Intel 8-core CPU is the highest-end socket 2011 chip, while AMD's 8-core CPUs are pretty much the mainstream chip. I think it was likely an FX-8320 at stock speeds, the slowest of the 8-core CPUs.
 
If you are curious about what Brad Wardell is working on, then you should drop by his personal blog. Especially check out this interesting post on whether DX12 and Mantle can deliver movie-quality visuals in games.

That was a good read, thanks. I like dev guys who can explain their craft in ways that even a non-dev such as myself can understand clearly. He seems very confident that DX12 is going to be a huge improvement. It would be great to bring some older graphics cards back to life, but fat chance that AMD/Nvidia will add DX12 support to them.
 
Right, but it increased by 820%; he included the original 100% in his claim, saying it increased 920%. It is 920% of the original performance, not a 920% increase. It increased by 820%, for a total of 920%.

Nearly every tech writer on any site I browse makes this mistake, and it gets old. I put a certain degree of trust in these writers to help me decide what products I buy, but when almost all of these people can't do basic math, it makes them hard to trust.


This is why I hate it when tech sites and products use the intentionally confusing % instead of x when referencing performance. The vast majority of us read 100% as double at first glance, until we pay attention to the wording. If you say it's 1.5x or 8x the performance, that's more easily understood.

Marketers can be brilliant bastards sometimes with these sly little tricks.
 
Those episode 1 graphics don't look so good.

Does Stardock even make graphics-intensive or photorealistic games? The Galciv series has always looked a decade or more out of date.

I'm not sure I trust this guy's opinion.
 
Just what I need, more lens flare in games.

Exactly what I was thinking.

Lens flare is in no way a desirable effect, except in some unusual circumstances where you are looking through a lens in-game.

Games shouldn't artificially appear more like movies; they should appear more like you were there in person, and unless you have really shitty glasses, the world doesn't have lens flare effects :p
 
This is why I hate it when tech sites and products use the intentionally confusing % instead of x when referencing performance. The vast majority of us read 100% as double at first glance, until we pay attention to the wording. If you say it's 1.5x or 8x the performance, that's more easily understood.

Marketers can be brilliant bastards sometimes with these sly little tricks.

this!! better to say it performs 9.2x better than dx11.

I remember getting into the config file of the Battle for Middle Earth game, and it had lots and lots of math errors in the multipliers vs. percentages for certain bonuses some characters had.
 
this!! better to say it performs 9.2x better than dx11.

Still incorrect. It performs "9.2x as well as dx11." It performs 8.2x better. Better = increase = you still add the original 100% to this figure to get overall performance.

But hey - your job isn't to write about these products, so it's acceptable for someone like you to make this mistake. It isn't the end of the world for Steve and other writers to make the mistake, but personally I expect better, considering this is his actual job.
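
For anyone still tripping over the wording, here's a tiny Python sketch of the distinction, again using the headline's 13/120 fps numbers (variable names are mine):

# "9.2x as fast" vs. "8.2x faster", using the 13 fps / 120 fps headline figures.
dx11_fps, dx12_fps = 13, 120

multiplier = dx12_fps / dx11_fps   # ~9.23 -> "9.2x as fast", i.e. ~920% of the original performance
faster_by = multiplier - 1         # ~8.23 -> "8.2x faster",  i.e. an ~823% increase

print(f"{multiplier:.1f}x as fast = {faster_by:.1f}x faster = {faster_by * 100:.0f}% increase")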
 
Zarathustra[H] said:
Exactly what I was thinking.

Lens flare is in no way a desirable effect, except in some unusual circumstances where you are looking through a lens in-game.

Games shouldn't artificially appear more like movies; they should appear more like you were there in person, and unless you have really shitty glasses, the world doesn't have lens flare effects :p

I would prefer it if even the movies lacked lens flare. That's one aspect of JJ Abrams' Star Trek I absolutely hated.
 
I can only imagine they are primarily using the function with the most dramatic performance difference. I doubt other games will get any sort of boost like this.
 
It's clearly a corner case, i.e. in the real world you won't see this sort of improvement. Still, as long as it doesn't have any major regressions in other cases, it's nice to have.

Yes and no. You won't see the improvement in most games, because certain techniques have such poor performance under DX11 and older that they could not be used. New games that do use those techniques will need DX12 or a similar API to run properly.
 
920% is weaksauce; the author shoulda said, "DX12 offers a 1,000,000% increase in performance using my magical GPU made from pure extract of leprechaun farts, take my word for it!"

Bit Boys Oy Glaze 3D?
 
I would prefer it if even the movies lacked lens flare. That's one aspect of JJ Abrams' Star Trek I absolutely hated.

It bothered me a little, but my bigger annoyance was the frequent handheld camera shots bouncing around that almost made me seasick.

That, and the whole alternate-universe discontinuity of the storyline. That really pissed me off.
 
It's new hardware that isn't optimized for DX11 but has tons of optimization for DX12, directly related to the things it was running.
 
If this is true, Microsoft has been sabotaging PC gaming for a long time. At a minimum, it has been dragging its feet and holding PC gaming back in a worse way than even I had guessed.
 