Square Enix, Microsoft, NVIDIA Show DX12 Demo At Microsoft BUILD

HardOCP News

During the keynote address at Microsoft BUILD 2015 earlier today, Square Enix gave a demonstration of the power of DirectX 12 on a system running four NVIDIA GeForce TITAN X video cards. Take a look.
 
I have been playing video games for almost 35 years, and one thing I have learned is not to get excited about tech demos and DirectX version changes. It's going to be a long time before (1) mainstream graphics cards can render these scenes and (2) developers make use of the improvements in the latest DX version. Sometimes DX features are never used at all, superseded by better or newer technology.
 
Now if they could just work on getting audio, especially speech, to sync up correctly with the animations...
 
Now if they could just work on getting audio, especially speech, to sync up correctly with the animations...

To be fair, it's Square Enix; the animations are going to be synced to the original Japanese dialogue more than to the dubbed English.
 
It's a nice-looking demo, but it appears to be locked at 30 fps. Even with early drivers and a possibly unoptimized engine, it doesn't really matter, because the number of users for a game that requires 4 Titan X cards to run at 30 fps can be counted by an average Sesame Street watcher.
 
I have been playing video games for almost 35 years, and one thing I have learned is not to get excited about tech demos and DirectX version changes. It's going to be a long time before (1) mainstream graphics cards can render these scenes and (2) developers make use of the improvements in the latest DX version. Sometimes DX features are never used at all, superseded by better or newer technology.

Except in this case just about every game currently supported is planned to update to DX12, simply for the massive boost in FPS alone. So it's worth getting excited about, especially for the multi-GPU/CPU scaling and independent GPU memory usage.
 
A big takeaway for me was the fact that they are using 8K textures and the Titan has 12 GB of onboard memory, which takes me back to when Carmack said that we won't be approaching photorealistic gaming until graphics cards have over 10 GB of memory as standard. I know someone can find the quote for me, as I'm too lazy to look it up right now. The simple ability for game developers to utilize such a substantial amount of memory for pure hi-res texture goodness is the single most important factor in advancing our 'concept' of next-gen graphics. By the time the next-gen consoles hit the market, we might finally be able to lean back in our chairs and say: wow, this is what I've always wanted to see.
 
The important thing that I gleaned from this demo is that Square Enix has partnered with Nvidia instead of AMD. :)
 
A big takeaway for me was the fact that they are using 8K textures and the Titan has 12 GB of onboard memory, which takes me back to when Carmack said that we won't be approaching photorealistic gaming until graphics cards have over 10 GB of memory as standard. I know someone can find the quote for me, as I'm too lazy to look it up right now. The simple ability for game developers to utilize such a substantial amount of memory for pure hi-res texture goodness is the single most important factor in advancing our 'concept' of next-gen graphics. By the time the next-gen consoles hit the market, we might finally be able to lean back in our chairs and say: wow, this is what I've always wanted to see.

Well, if the rumors about DX12 are true, the Windows 10 OS sees those Titans as 48 GB of VRAM, instead of the 12 GB it would under DX11.
 
Why would anyone give a crap about anything running on $4,000 worth of video cards...? Why not just show it running on a $50,000,000.00 ILM rendering farm... same difference.

Show us what DX12 can do on a 970 for starters, so the rank-and-file computer user can know whether or not to GAF.....
 
Well, if the rumors about DX12 are true, the Windows 10 OS sees those Titans as 48 GB of VRAM, instead of the 12 GB it would under DX11.

This won't be the case. Data needs to be shared by the cards, and pulling comical amounts of it over PCI-E is suicide.

I suppose you could come up with some weird schemes to try to avoid this, but that's another can of worms, problems, and complexities. This is up to the developer to manage now, as well.
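
For what it's worth, this developer-managed model is visible in the D3D12 API itself. Here's a minimal sketch of placing a resource in one GPU's local memory on a linked-node adapter; the function name and parameters are illustrative, not from the demo:

```cpp
// Sketch: backing a texture with one GPU's local memory on a D3D12
// linked-node (multi-GPU) adapter. Error handling omitted; 'device'
// and 'desc' are assumed to be set up elsewhere.
#include <d3d12.h>

ID3D12Resource* CreateTextureOnNode(ID3D12Device* device,
                                    const D3D12_RESOURCE_DESC& desc,
                                    UINT nodeIndex, UINT nodeCount)
{
    D3D12_HEAP_PROPERTIES heapProps = {};
    heapProps.Type = D3D12_HEAP_TYPE_DEFAULT;
    // CreationNodeMask picks the GPU whose local memory backs the
    // resource; VisibleNodeMask controls which GPUs may touch it.
    // Cross-node access goes over PCI-E, which is exactly the cost
    // being discussed above.
    heapProps.CreationNodeMask = 1u << nodeIndex;
    heapProps.VisibleNodeMask  = (1u << nodeCount) - 1; // visible to all

    ID3D12Resource* texture = nullptr;
    device->CreateCommittedResource(
        &heapProps, D3D12_HEAP_FLAG_NONE, &desc,
        D3D12_RESOURCE_STATE_COMMON, nullptr,
        IID_PPV_ARGS(&texture));
    return texture;
}
```

Whether the data lives on one card or is mirrored across all of them is now entirely the application's call, which is both the power and the burden.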
 
Well, if the rumors about DX12 are true, the Windows 10 OS sees those Titans as 48 GB of VRAM, instead of the 12 GB it would under DX11.

Please provide some info on this, because that's just not how SLI works AFAIK, even with DX12.
 
Except in this case just about every game currently supported is planned to update to DX12, simply for the massive boost in FPS alone. So it's worth getting excited about, especially for the multi-GPU/CPU scaling and independent GPU memory usage.
Games that were coded for DX11 aren't going to show much of an improvement, if any, on DX12, unless they were heavily CPU-bound to begin with, like a massive RTS or something with an absolute ton of physics, etc.
 
Anyone smell something fishy as he struggles at first in manual mode, then suddenly quick-pans perfectly in on her chin? These tech demos are so staged...
 
Why would anyone give a crap about anything running on $4,000 worth of video cards...? Why not just show it running on a $50,000,000.00 ILM rendering farm... same difference.

Show us what DX12 can do on a 970 for starters, so the rank-and-file computer user can know whether or not to GAF.....

My thoughts exactly. So it looks amazing on FOUR Titan X video cards? I should hope so. Yeah I'm impressed with it, but show me what my single GTX 970 can do and I might get excited.
 
They would "give a crap" because some people have those video cards in their systems. There are many users on [H] alone that are running 4-way TITAN X. $4000.00 is not much money. It's certainly not as out of reach as people pretend on these message boards.
 
Meh... am I the only one not impressed? To me it doesn't look THAT amazing, or even slightly groundbreaking, versus graphics from the last five or six years at least. Typical tech demo shit... increased polygons and a nice blur and HDR effect plastered onto the scene.

And all of that took 4 Titans??? Jeebus jumpin' jackrabbits...
 
What will be funny is when the game is actually released and the SLI profile doesn't work for it.

Or maybe that won't be so funny.
 
Please provide some info on this, because that's just not how SLI works AFAIK, even with DX12.
http://www.tomshardware.com/news/microsoft-directx12-amd-nvidia,28606.html

"The source said that with binding the multiple GPUs together, DirectX 12 treats the entire graphics subsystem as a single, more powerful graphics card. Thus, users get the robustness of a running a single GPU, but with multiple graphics cards."

"Part of this new feature set that aids multi-GPU configurations is that the frame buffers (GPU memory) won't necessarily need to be mirrored anymore. In older APIs, in order to benefit from multiple GPUs, you'd have the two work together, each one rendering an alternate frame (AFR). This required both to have all of the texture and geometry data in their frame buffers, meaning that despite having two cards with 4 GB of memory, you'd still only have a 4 GB frame buffer.

DirectX 12 will remove the 4 + 4 = 4 idea and will work with a new frame rendering method called SFR, which stands for Split Frame Rendering. Developers will be able to manually, or automatically, divide the texture and geometry data between the GPUs, and all of the GPUs can then work together to work on each frame. Each GPU will then work on a specific portion of the screen, with the number of portions being equivalent to the number of GPUs installed."

SFR is nothing new, but I guess the way DX12 handles it is. In any case, take all this with a grain of salt until it has all been proven.
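
The screen-partitioning idea the article describes is easy to picture in code. A rough sketch, assuming a simple vertical-strip split and D3D12 scissor rectangles (the per-GPU command-list plumbing is assumed, not shown):

```cpp
// Sketch: dividing the screen into vertical strips, one per GPU, as a
// naive form of split-frame rendering. Each strip would be set as the
// scissor rect on the command list executed on the matching GPU node.
#include <d3d12.h>
#include <vector>

std::vector<D3D12_RECT> SplitFrame(UINT width, UINT height, UINT gpuCount)
{
    std::vector<D3D12_RECT> strips(gpuCount);
    const LONG stripWidth = static_cast<LONG>(width / gpuCount);
    for (UINT i = 0; i < gpuCount; ++i) {
        strips[i].top    = 0;
        strips[i].bottom = static_cast<LONG>(height);
        strips[i].left   = static_cast<LONG>(i) * stripWidth;
        // The last strip absorbs any remainder from the division.
        strips[i].right  = (i == gpuCount - 1)
                               ? static_cast<LONG>(width)
                               : static_cast<LONG>(i + 1) * stripWidth;
    }
    return strips;
}

// Usage, per GPU i, on that GPU's own command list:
//   commandList->RSSetScissorRects(1, &strips[i]);
```

Equal strips are the trivial case; the hard part, as noted further down, is that equal area is not equal work.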
 
OK, let's work this out...

Those 4 Titans were rendering that demo at a framerate that seemed to be about 40-60 fps, suggesting it probably could have used a 5th Titan to hold a fluid 60 fps.

So, taking the performance and price of the Titan into account, we are looking at about 4 to 6 years before a single GPU will offer the power of 5 Titan cards at a semi-affordable price of, say, $400-$500.

Taking that into account, we are looking at 7-9 years before consoles are able to pull the same tech demo off in real time.

I would hate to think what would happen to these numbers if AMD were to go under in the meantime.
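
The 4-6 year figure roughly checks out if you assume single-GPU throughput doubles about every two years (an assumption, not a law):

```cpp
// Back-of-the-envelope check: years until one GPU matches 5 Titan X's,
// assuming throughput doubles roughly every two years.
#include <cmath>
#include <cstdio>

int main()
{
    const double targetMultiple   = 5.0; // 5 Titan X's worth of throughput
    const double yearsPerDoubling = 2.0; // assumed cadence
    const double doublings = std::log2(targetMultiple); // ~2.32
    const double years = doublings * yearsPerDoubling;  // ~4.6
    std::printf("~%.2f doublings -> ~%.1f years\n", doublings, years);
    return 0;
}
```

That lands at roughly 4.6 years for raw throughput; hitting the $400-$500 price point as well is what pushes the estimate toward the upper end of the range.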
 
Anyone smell something fishy as he struggles at first in manual mode, then suddenly quick-pans perfectly in on her chin? These tech demos are so staged...

Seems like you're looking for something that isn't there, looking for a reason to be skeptical.

As with most tech demos, there are preset camera angles and camera focus points. He clearly says that he's skipping over the camera position that shows the ground debris and moving on to the camera that focuses back on the woman character. It is highly likely, then, that the camera is set to bias toward the face, so that he can easily show the up-close detail of the textures and shaders on the face and head.
 
Why would anyone give a crap about anything running on $4,000 worth of video cards...? Why not just show it running on a $50,000,000.00 ILM rendering farm... same difference.

Show us what DX12 can do on a 970 for starters, so the rank-and-file computer user can know whether or not to GAF.....

It's not the same difference. At all.

This was (hypothetically) running in real time on consumer-level graphics hardware that is available NOW. Granted, most people cannot afford to buy 4 GTX Titan X's and the requisite supporting hardware, but that's not the point.

This is just a proof of concept at the moment; no one is saying "hey, all the next-gen games are going to look like this on low-mid to mid-high range hardware!". But you have to prove that it CAN be done first; once that is done, THEN you can start optimizing it to run better and more efficiently, so that as the technology progresses, it CAN start to be played on lower-end hardware.

Add to that the fact that major DirectX versions typically last anywhere from 3-6 years, and the power of the hardware coming out during that lifetime will increase multiple times over (assuming it keeps going at around the same rate as it has been).

Now, I'm not saying that everyone should start gushing over this and expecting a miracle overnight, and obviously we don't know all the variables in play with this demo, but if people like you guys, who come in here and just shit on everything that's put out there, were in charge, with your mindset, nothing would ever progress... we'd still be using a slide rule. :rolleyes:
 
I have to agree with others here: 4 Titans! Really? The only thing impressive about that is the 4 Titans. I'm sorry, but it just didn't seem that impressive to me. Now, if the demo was running on a 480 or a 5970, then I would be clapping my hands saying bravo, bravo!
I am more amazed at how good GTA V looks and runs on my rig than at this demo running on a $10,000 PC. Any game for the next 5 years will be made for the current consoles anyway. We are stuck in this cycle for a while, but GTA, for me at least, gives me hope for the games to come.
 
I think it’s a great time to be a PC gamer; with VR looming around the corner, I am super excited.
 
They would "give a crap" because some people have those video cards in their systems. There are many users on [H] alone that are running 4-way TITAN X. $4000.00 is not much money. It's certainly not as out of reach as people pretend on these message boards.

The arrogance in this statement is hilarious.

How do you know what other people can afford, what their expenses are, or how much they make in a month? I suggest you keep statements like this to yourself instead of coming across like some entitled little kid with rich parents.

The average person who has a life is not going to spend $4,000 on video cards. You do know some people have better things to do with their time and money than to be playing video games all day.
 
For people who would be interested in the absolute bleeding edge, $4,000 is either not much money, or they should probably reevaluate their priorities. For everyone else, it's just a tech demo, move along.


The average person who has a life is not going to spend $4,000 on video cards.

This is [H]ard, not [A]verage.
 
Why would anyone give a crap about anything running on $4,000 worth of video cards...? Why not just show it running on a $50,000,000.00 ILM rendering farm... same difference.

Show us what DX12 can do on a 970 for starters, so the rank-and-file computer user can know whether or not to GAF.....

That won't render in real time. 4 Titan X's is probably about as good as it is possible to get for real-time graphics.
 
That won't render in real time. 4 Titan X's is probably about as good as it is possible to get for real-time graphics.

Speaking of real time, whatever happened to Caustic Graphics and their real-time ray tracing? Was that just a bunch of bullshit?
 
http://www.tomshardware.com/news/microsoft-directx12-amd-nvidia,28606.html

"The source said that with binding the multiple GPUs together, DirectX 12 treats the entire graphics subsystem as a single, more powerful graphics card. Thus, users get the robustness of a running a single GPU, but with multiple graphics cards."

"Part of this new feature set that aids multi-GPU configurations is that the frame buffers (GPU memory) won't necessarily need to be mirrored anymore. In older APIs, in order to benefit from multiple GPUs, you'd have the two work together, each one rendering an alternate frame (AFR). This required both to have all of the texture and geometry data in their frame buffers, meaning that despite having two cards with 4 GB of memory, you'd still only have a 4 GB frame buffer.

DirectX 12 will remove the 4 + 4 = 4 idea and will work with a new frame rendering method called SFR, which stands for Split Frame Rendering. Developers will be able to manually, or automatically, divide the texture and geometry data between the GPUs, and all of the GPUs can then work together to work on each frame. Each GPU will then work on a specific portion of the screen, with the number of portions being equivalent to the number of GPUs installed."

SFR is nothing new, but I guess the way DX12 handles it is. In any case, take all this with a grain of salt until it has all been proven.

DX12 doesn't handle it; that's the point. It's in the hands of the developer. Partitioning the scene like that is what I was referring to earlier, but how do you load-balance it? It kinda just works in AFR, sorta. More elaborate schemes like this need more thought: you need to make sure one card isn't just snoozing because whatever the other one got stuck rendering is more expensive.

You can potentially save memory, but likely at the cost of raw performance. Is that worth it? Maybe, but it's not quite a free lunch.
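
One naive way to attack that load-balancing problem is to nudge the split boundary each frame based on measured per-GPU frame times. A purely illustrative sketch for the two-GPU case (every name here is made up):

```cpp
// Naive SFR load balancer: shift the split point away from the slower
// GPU so both finish a frame at roughly the same time. Illustrative
// only; a real engine would need damping, tile-boundary alignment,
// and handling for more than two GPUs.
float RebalanceSplit(float split,        // current boundary in [0, 1]
                     float gpu0FrameMs,  // measured time for the left strip
                     float gpu1FrameMs)  // measured time for the right strip
{
    const float gain = 0.05f; // small step to avoid oscillation
    if (gpu0FrameMs > gpu1FrameMs)
        split -= gain * (gpu0FrameMs - gpu1FrameMs) / gpu0FrameMs;
    else
        split += gain * (gpu1FrameMs - gpu0FrameMs) / gpu1FrameMs;

    // Keep both strips a sane minimum size.
    if (split < 0.1f) split = 0.1f;
    if (split > 0.9f) split = 0.9f;
    return split;
}
```

Even this toy version shows why it isn't free: the feedback loop only reacts after a slow frame has already happened.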
 
Looks plastic to me, and I'm NOT impressed. They had better get that demo to run on one card instead of 4! FUCK YOU, YOU CRAPPY POS DEV TEAM. Find new work if you can't make it run on one card.
 
QFT. For many in this sour economy that are less fortunate, it can be as much as 25 or 33% of their income.

And in 4 years this will be commonplace. If you don't want to be "less fortunate", change your circumstances. Of course, it might require you to stop posting on [H] and really work, but it can change.

Maybe I should complain about not being able to have my own private jet because it would be more than 100% of my income. :D
 
Incredibly impressive regardless; the character looked like it was straight out of an FF CGI film. The environment and shadows less so, but the character and the clothing: WOW.
 
It is highly likely, then, that the camera is set to bias toward the face, so that he can easily show the up-close detail of the textures and shaders on the face and head.
Yeah, that was my point: it was not truly "manual". It's not a big deal, just a bit of grumpiness over tech demos like this, as they are historically very inaccurate in showing what a system will eventually provide for an actual gamer.

That said, the quality is on par with what I expect from $4K worth of cards... which is not a huge jump in realism. The people complaining here about that don't seem to realize that with each generation the jump is getting smaller and smaller, and as we muck around in the uncanny valley it may even appear to get worse before it gets better.
 