Unigine "Heaven" DirectX 11 Benchmark

Well, it's not working on my PC. I'm having a driver error with my ATI 5770 card, on Win7 x64.

Engine:init(): clear video settings for "ATI Radeon HD 5700 Series 8.660.6.0"
ALSound::require_extension(): required extension
AL_EXT_LINEAR_DISTANCE is not supported.

I get the same problem running XP 32, exactly the same error message.
I downloaded and installed the latest OpenAL and rebooted, but it makes no difference.
Am running GTX260 with 191.07 WHQL video driver.
Very different setups, so it looks like it is hardware and OS agnostic.
 
Tested out this benchmark overclocked and at stock, and I gotta say, it's very GPU dependent.

There aren't many GPU-intensive "benchmarks" around :)

DX11 is coming waaaaay faster than DX10 did. Now I just need some 5870s to play with myself; waiting for the 2GB versions.
 
It looks great to me.
Why is it these demos always look better than the games we play?
I'd love to play games that had that much detail and ambience.

When the average game looks like this then I think we'll have arrived.

That's the thing about all these DirectX updates: it's going to be a while before
we see them really leveraged. If you basically build a DX9 game and throw in a few DX11 effects, it's kind of hokey. I see a lot of that in games these days.

:D
 
I just hope Rage uses DX11. And I hope you're all aware that id Software is owned by Bethesda's parent company (ZeniMax).
My point is: an Elder Scrolls game using id's engine, the one they made for Rage :|:| that would be slick! :D
 
I just hope Rage uses DX11. And I hope you're all aware that id Software is owned by Bethesda's parent company (ZeniMax).
My point is: an Elder Scrolls game using id's engine, the one they made for Rage :|:| that would be slick! :D

Every idTech engine uses OpenGL as its primary renderer; D3D hardware rendering has never been an option. The only exceptions are the Doom 3 engine-based Xbox/360 ports.
 
I hope devs don't think that those roads look good. Other than that it was ok, with amazing shadows.
 
Am I the only person who kept thinking of Goldeneye 64 when the music first kicked in? (about :26ish in)

It was nice and purty, though I'll hop on the bandwagon that thinks this doesn't seem to have much standout fanciness. Then again DX10 only really wowed me in Crysis with the lighting through trees, and I didn't exactly spend much time playing the game looking up in the air. And as much as I hope this gets optioned for another Elder Scrolls game, chances are they already have something of their own going, considering their development cycles.
 
It's not about how bumpy the roads are, it's about how much detail can be added to an otherwise flat surface in real-time with dynamic LOD. And yes, the performance hit from tessellation for this demo is large, but that's because it's purposefully over-done. I'm sure that future games using tessellation will include a slider to determine the extent to which surfaces are subdivided. It's exciting that we'll finally have performance-friendly "displacement mapping" that performs well with shadows, SSAO, and anisotropic filtering :)
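A slider like that could simply scale a distance-based tessellation factor before it reaches the hardware tessellator. Here's a minimal C++ sketch of the idea; the function names, the falloff constant, and the slider semantics are all assumptions for illustration, not anything taken from the Heaven demo (D3D11 does cap hardware tessellation factors at 64):

```cpp
#include <algorithm>

// Direct3D 11's hardware tessellator caps factors at 64.
constexpr float kMaxTessFactor = 64.0f;

// 'slider' is the hypothetical user setting in [0, 1] discussed above:
// 0 disables extra subdivision entirely, 1 allows the maximum.
float TessFactorForEdge(float distanceToCamera, float slider) {
    // Subdivide less as the patch recedes from the camera; the 10.0f
    // falloff constant is an arbitrary tuning value for this sketch.
    float byDistance = kMaxTessFactor / (1.0f + distanceToCamera / 10.0f);
    float scaled = 1.0f + (byDistance - 1.0f) * slider;
    return std::clamp(scaled, 1.0f, kMaxTessFactor);
}
```

Per-frame, per-patch LOD like this is what keeps the technique "performance-friendly": distant surfaces stay nearly flat while nearby ones get the full displacement detail.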
 
00000.jpg
 
I walked around in the demo checking out tessellation on/off.

I was under the impression that the added triangles were generated by hardware only. In this demo it almost looks like they are programmed.
 
Without side-by-side comparisons I don't think we will see a lot of differences. I can not run the DX11 version, but from looking at the video and from running the DX10 version, I did notice that the dragon looked very different and did not have any of the large body spikes.
 
It looks so damn good! I spent 20min walking around and checking everything out. Was pretty smooth at 1080p with 4X AF on one 5850. I would definitely buy another for xfire if games would use this.

Unigine
Heaven Demo v1.0
FPS: 30.7
Scores: 773

Hardware
Binary: Windows 32bit Visual C++ 1500 Release Oct 22 2009
Operating system: Windows 7 (build 7600) 64bit
CPU model: Intel(R) Core(TM) i7 CPU 920 @ 2.67GHz
CPU flags: 3600MHz MMX SSE SSE2 SSE3 SSSE3 SSE41 SSE42 HTT
GPU model: ATI Radeon HD 5800 Series 8.670.0.0 1024Mb

Settings
Render: direct3d11
Mode: 1920x1080 fullscreen
Shaders: high
Textures: high
Filter: trilinear
Anisotropy: 4x
Occlusion: enabled
Refraction: enabled
Volumetric: enabled
 
I walked around in the demo checking out tessellation on/off.

I was under the impression that the added triangles were generated by hardware only. In this demo it almost looks like they are programmed.

It's hardware accelerated, but to utilize the feature properly, and to deform it to the extent of the dragon, I believe you need to create a custom height map for the textures you're modifying... sort of like relief mapping, but way better because it's efficient and yet more robust. Anyway, it's up to the dev to implement, and in this instance it was done with much thought and effort to generate hype :p
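The height-map step described there boils down to pushing each tessellator-generated vertex out along its surface normal by a sampled height. A rough C++ sketch of that general technique, with hypothetical names (not the demo's actual code):

```cpp
struct Vec3 { float x, y, z; };

// Displace a vertex along its normal. 'height' is the value sampled
// from the artist-authored height map in [0, 1]; 'scale' is the
// developer-chosen maximum displacement distance.
Vec3 Displace(Vec3 position, Vec3 normal, float height, float scale) {
    float d = height * scale;
    return { position.x + normal.x * d,
             position.y + normal.y * d,
             position.z + normal.z * d };
}
```

In D3D11 this happens in the domain shader after the fixed-function tessellator emits the new vertices, which is why the triangles look "programmed": the hardware only generates them, and the shader decides where they go.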
 
All you folks running i7's @ 4GHz and 5800 series graphics cards but for some reason are not enabling at least 4xAA and 16xAF should be ashamed of yourselves. :p
 
It's hardware accelerated, but to utilize the feature properly, and to deform it to the extent of the dragon, I believe you need to create a custom height map for the textures you're modifying... sort of like relief mapping, but way better because it's efficient and yet more robust. Anyway, it's up to the dev to implement, and in this instance it was done with much thought and effort to generate hype :p

Thanks for the response.

I've gone and done some reading and that seems to be the explanation.

I guess it is impressive, but it sucks that we still have to depend on someone doing their job properly to reap the benefits of the feature.
 
FPS will probably go up a bit more once ATI gets their driver hax out for it.

I also found it interesting that the clouds below the city are seemingly harder to render than the city itself. :confused:

Just looking up and away from the clouds alone is enough to skyrocket my FPS from 60 to 200.
 
Also the engine doesn't seem to occlude hidden geometry in this benchmark. Blocking off the entire view with a house or something didn't affect FPS.

I'd guess it's purposely more stressful than a real gameplay scenario would be.


Can someone also post an FPS comparison between tessellation on and off? I don't have a DX11 card. :(
 
Am I way off in saying that this doesn't seem like a significant jump ahead? I haven't seen any effect that I haven't seen before. I also didn't notice much of any difference in DX10. So what's the real expense-justifying value over DX9 SM3 ?
 
Also the engine doesn't seem to occlude hidden geometry in this benchmark. Blocking off the entire view with a house or something didn't affect FPS.

I'd guess it's purposely more stressful than a real gameplay scenario would be.


Can someone also post an FPS comparison between tessellation on and off? I don't have a DX11 card. :(

Ask and ye shall receive!

Both tests run on the sig rig with 3x SLI GTX 280s. Performance for the DX11 version was good but a lot choppier than the DX10 run. Also, this engine seems to love SLI. I didn't test single-card performance, but my SLI indicators were at max in both runs, so this engine looks to be multi-GPU friendly. AA was off.

unigine_20091024_2253.jpg
unigine_20091024_2259.jpg
 
Well, it's not working on my PC. I'm having a driver error with my ATI 5770 card, on Win7 x64.

Engine:init(): clear video settings for "ATI Radeon HD 5700 Series 8.660.6.0"
ALSound::require_extension(): required extension
AL_EXT_LINEAR_DISTANCE is not supported.

Having the same problem... on a 5850.
 
Thanks for the response.

I've gone and done some reading and that seems to be the explanation.

I guess it is impressive, but it sucks that we still have to depend on someone doing their job properly to reap the benefits of the feature.
But that's always been the case.


Where are the 2560x1600 benchmarks?

Mine from rig in sig.

Vsync disabled
Unigine

Heaven Demo v1.0

FPS:25.4
Scores:640

Hardware

Binary:Windows 32bit Visual C++ 1500 Release Oct 22 2009
Operating system:Windows 7 (build 7600) 64bit
CPU model:Intel(R) Core(TM)2 Quad CPU Q6600 @ 2.40GHz
CPU flags:3001MHz MMX SSE SSE2 SSE3 SSSE3 HTT
GPU model:NVIDIA GeForce GTX 280 8.16.11.9107 1024Mb

Settings

Render:direct3d11
Mode:2560x1600 fullscreen
Shaders:high
Textures:high
Filter:trilinear
Anisotropy:16x
Occlusion:enabled
Refraction:enabled
Volumetric:enabled

------------------------
Vsync enabled

Unigine

Heaven Demo v1.0

FPS:21.2
Scores:533

Hardware

Binary:Windows 32bit Visual C++ 1500 Release Oct 22 2009
Operating system:Windows 7 (build 7600) 64bit
CPU model:Intel(R) Core(TM)2 Quad CPU Q6600 @ 2.40GHz
CPU flags:3001MHz MMX SSE SSE2 SSE3 SSSE3 HTT
GPU model:NVIDIA GeForce GTX 280 8.16.11.9107 1024Mb

Settings

Render:direct3d11
Mode:2560x1600 fullscreen
Shaders:high
Textures:high
Filter:trilinear
Anisotropy:16x
Occlusion:enabled
Refraction:enabled
Volumetric:enabled
 
lol, I have a 9500GT 512MB DDR2 (DX10):

8 FPS in DX9
3 FPS in DX10
2 FPS in OpenGL

with everything on high.
 
Anyone else creeped out by the town?

This reminds me far too much of the Myst games.
Yeah, I felt the same way; the ambience would be perfect for a survival/horror or mystery game. Gorgeous!
 
Ask and ye shall receive!

Both tests run on the sig rig with 3x SLI GTX 280s. Performance for the DX11 version was good but a lot choppier than the DX10 run. Also, this engine seems to love SLI. I didn't test single-card performance, but my SLI indicators were at max in both runs, so this engine looks to be multi-GPU friendly. AA was off.

Funny how you can run in DX11 without a DX11 card.
 
The benchmark reports the GTX280 for some reason, even though I'm rendering through the ATI card. Strange. I think it finds the GTX280 as the first card?
 
That vid was sick. If that was ever made into a game, I'd play it.
 
Simply amazing. I am definitely getting another 5870 when these games start rolling out... but I must wonder: is my CPU the bottleneck here? Where is the line drawn on knowing if your CPU is slowing down your 5870? Which CPUs and whatnot; do any of you know?

FPS: 31.7
Scores: 798

Hardware
Binary: Windows 32bit Visual C++ 1500 Release Oct 22 2009
Operating system: Windows 7 (build 7600) 64bit
CPU model: Intel(R) Core(TM)2 Duo CPU E8500 @ 3.16GHz
CPU flags: 3318MHz MMX SSE SSE2 SSE3 SSSE3 SSE41 HTT
GPU model: ATI Radeon HD 5800 Series 8.660.6.0 1024Mb

Settings
Render: direct3d11
Mode: 1920x1080 2xAA fullscreen
Shaders: high
Textures: high
Filter: trilinear
Anisotropy: 8x
Occlusion: enabled
Refraction: enabled
Volumetric: enabled
 
It's worth overclocking it a bit to see; I definitely see good gains while gaming with a fast CPU (4GHz+) on a GTX260.
Faster cards require more CPU, so you will hopefully get some return.
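One rough way to answer the bottleneck question, assuming you can rerun the benchmark at two resolutions: a GPU-bound scene speeds up roughly with pixel count, so if a much lower resolution barely raises FPS, the CPU is probably the ceiling. A sketch of that heuristic (the 10% threshold is an arbitrary assumption):

```cpp
// Heuristic: if FPS barely improves when the resolution is cut way
// down, the CPU, not the GPU, is probably the limiting factor.
bool LikelyCpuBound(double fpsHighRes, double fpsLowRes) {
    // Less than a 10% gain from the lower resolution suggests
    // the GPU was never the thing holding the frame rate back.
    return fpsLowRes < fpsHighRes * 1.10;
}
```

So try the benchmark at 1920x1080 and again at something like 1024x768: if the score hardly moves, a faster CPU (or overclock) would help more than a second 5870.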
 
Wow. Major flashbacks to MYST !

Is this a real game in any way?
 
The benchmark reports the GTX280 for some reason, even though I'm rendering through the ATI card. Strange. I think it finds the GTX280 as the first card?

Ha!

That's AMD's secret!
They made a "real" 295 with two Nvidia 280's!

It's industrial espionage I tell you!

:D
 
I can not get this to run; I keep getting NOT RESPONDING after 1-2 minutes of nothing. I have 2x 4870 with an i7 920 on an X58 motherboard, and I'm not sure what the problem is. Is anyone else having issues getting this to run?
 