New Unreal 3 Videos

chrisf6969 said:
But all new processors in future roadmaps are planned to have at least dual cores (2 processors). You'll be able to dedicate one of those cores to physics in a game. And that's a lot more likely to be used by game developers than a specialized add-in card that requires driver support, etc.

I know about dual cores, it's just that what you said made no sense. You were referring to the number of processes that could be simultaneously executed as threads. And btw, a dual proc won't dedicate a core to just your game; it will timeslice with all the other apps running, like a single core processor.
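For what it's worth, here's a toy sketch (not from any real engine, names are made up) of the model being argued about: physics running on its own thread, with the OS scheduler free to put it on the second core, while everything else on the box still gets timesliced in.

```cpp
// Toy sketch only: a "physics thread" next to the main thread. The OS
// scheduler decides which core actually runs it; nothing here reserves a core.
#include <atomic>
#include <chrono>
#include <thread>

std::atomic<bool> running{true};

void physicsLoop() {
    while (running) {
        // a real engine would step the simulation here; we just tick at ~60 Hz
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
}

int main() {
    std::thread physics(physicsLoop);                      // may land on the second core
    std::this_thread::sleep_for(std::chrono::seconds(1));  // stand-in for game/render work
    running = false;
    physics.join();
}
```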

bonkrowave said:
^^ When did they change the name of a "bump" map to a "normal" map? I know bump maps are now called normal maps ... but why, and when did this happen?

I was talking to a guy about getting some bumpy looking surfaces and I used the word bump map, and he was like ... what the hell are you talking about. And he promptly told me it was a normal map ?

I know bump maps are done with a grey scale. The value of grey determines how far the surface is raised.

http://en.wikipedia.org/wiki/Normal_map
 
What I really can't wait to see, which I think would be fairly easy to do, would be to make it so that the farther away you are from something, the less detail you'd be able to make out. As it is right now, I can play HL2 and see somebody picking their nose across the stage (example). And maybe some decent fog effects.

I want to play DoD, (or any other game really) and see the shape of somebody on the horizon, but not be able to tell if they're axis or allied and hesitate to shoot.

And then maybe down the road, some killer water/fire effects like the scene in Saving Private Ryan where it starts to rain on the leaves/ground/puddles and you see water fly all over the place when they run through the puddles.


EDIT: If you can't tell, I'm a fan of WW2 games/movies :p
 
Vulcanus said:
What I really can't wait to see, which I think would be fairly easy to do, would be to make it so that the farther away you are from something, the less detail you'd be able to make out. As it is right now, I can play HL2 and see somebody picking their nose across the stage (example). And maybe some decent fog effects.

I want to play DoD, (or any other game really) and see the shape of somebody on the horizon, but not be able to tell if they're axis or allied and hesitate to shoot.

And then maybe down the road, some killer water/fire effects like the scene in Saving Private Ryan where it starts to rain on the leaves/ground/puddles and you see water fly all over the place when they run through the puddles.


EDIT: If you can't tell, I'm a fan of WW2 games/movies :p

well they already reduce detail in far away objects with LOD

but i think what you are talking about is maybe depth of field

the effect is very possible on current graphics cards

there are some tech demos out there that show this as well
 
Vulcanus said:
What I really can't wait to see, which I think would be fairly easy to do, would be to make it so that the farther away you are from something, the less detail you'd be able to make out.

do you mean emulating the human eye, like 20/20 vision and less or more? (i may be off base here) that would be interesting. ive thought itd be cool for developers to pay attention to this more. like maybe if you choose a grunt soldier class, his vision won't be quite as sharp as a sniper class's.
 
bonkrowave said:
^^ When did they change the name of a "bump" map to a "normal" map? I know bump maps are now called normal maps ... but why, and when did this happen?

I was talking to a guy about getting some bumpy looking surfaces and I used the word bump map, and he was like ... what the hell are you talking about. And he promptly told me it was a normal map ?

I know bump maps are done with a grey scale. The value of grey determines how far the surface is raised.

Bump maps are still used by some people for some things, but normal maps can be applied effectively to anything and give a better overall result so that is the way the industry is going. The developers of the Unreal 3 engine have stated that basically everything in the game will be normal mapped.

I won't go into the exact details of how normal maps work (mostly since I don't fully understand it myself), but an easy way to describe them is as a "realtime" bump map. They are very similar to bump maps, but instead of being grayscale they use multiple colors, and they are far more dynamic in the way they work with lighting.
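To make the "works with lighting" part concrete, here's a rough sketch (my own illustration, not how the Unreal engine actually does it; the function names are made up): each texel's RGB is decoded back into a surface normal and fed into an ordinary N-dot-L diffuse term, so the painted-on bumps shade correctly as the light moves.

```cpp
// Toy illustration: decode an RGB normal-map texel and use it for diffuse lighting.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Map an 8-bit RGB texel (0..255 per channel) back into a unit normal in [-1, 1].
Vec3 decodeNormal(unsigned char r, unsigned char g, unsigned char b) {
    Vec3 n{ r / 255.0f * 2.0f - 1.0f,
            g / 255.0f * 2.0f - 1.0f,
            b / 255.0f * 2.0f - 1.0f };
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return { n.x / len, n.y / len, n.z / len };
}

// Plain Lambert diffuse: brightness follows the per-pixel normal, so the
// "bumps" shade correctly as the light direction changes, even on a flat mesh.
float diffuse(const Vec3& n, const Vec3& lightDir) {
    return std::max(n.x * lightDir.x + n.y * lightDir.y + n.z * lightDir.z, 0.0f);
}

int main() {
    Vec3 n = decodeNormal(128, 128, 255);           // a texel pointing mostly "up"
    printf("%f\n", diffuse(n, {0.0f, 0.0f, 1.0f})); // light hitting head-on
}
```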

The neatest thing that is typically done with them is this: Let's say you are making your main character in a game. To make a normal map you create two versions of him/her, a 4,000 polygon version and a 200,000 polygon version (where even little lines in the skin are present). Then you have a program that overlays the detailed model on the simpler one and finds all the differences. It turns these differences into a "normal map". You then use the 4,000 poly model in the game with the normal map applied over it and voila, you have an image that looks like a ~100,000 poly character (some detail is lost) for the hit of only a 4,000 poly character.

This technique can be applied to everything in a level so you can have highly detailed buildings and enemies for a far lower hit on the system than you would expect. You can also manually paint normal maps, much like with bump maps, and you can convert bump maps into normal maps.

You can also compress normal maps so they take less space. That is what ATI's 3DC compression is primarily about. Currently normal maps are compressed with an algorithm designed for texture compression, and the loss on a normal map really damages its overall effect. So without 3DC the previous example might end up with a ~20,000 polygon effect, while with 3DC the final effect would be about equal to ~80,000 polygons.
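One detail that makes a two-channel scheme like 3DC a good fit for normals (this is my own hedged sketch, not ATI's spec): a unit-length normal's Z component can be rebuilt from X and Y, so only two channels really need to survive compression.

```cpp
// Toy illustration: rebuild the Z component of a unit-length tangent-space
// normal from the two stored channels.
#include <cmath>
#include <cstdio>

float reconstructZ(float x, float y) {
    // x^2 + y^2 + z^2 = 1, and tangent-space normals point out of the surface,
    // so z is the non-negative root.
    float zsq = 1.0f - x * x - y * y;
    return zsq > 0.0f ? std::sqrt(zsq) : 0.0f;
}

int main() {
    printf("z = %f\n", reconstructZ(0.3f, 0.4f)); // ~0.866
}
```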

Anyway, google "normal mapping" and you should be able to find out lots more about it.


 
I remember seeing it on a tech demo somewhere, so I know it's possible. But a game and a tech demo are totally different beasts. ;)

Also LOD = ??
 
The fourth video was the only new one; I saw the other 3 (well, video recordings of the other 3 anyway) last year after they showed them at a developers conference. Those older recordings had far better color, so they were a little more impressive. I would really like to see the 4th video with decent coloring.


 
Vulcanus said:
I remember seeing it on a tech demo somewhere, so I know it's possible. But a game and a tech demo are totally different beasts. ;)

Also LOD = ??

LOD means "Level of Detail"

In a game you want everything to be highly detailed, so you of course design everything with plenty of polygons. The only problem is that if your player's PoV is on a hillside looking over a large city, and everything in that city is being rendered in full detail, then even the most powerful PC in the world will be brought to its knees.

So instead what you do is set up the game engine so that as the distance from the player increases, the polygons used by each object decrease. So a tavern all the way on the other side of the city is maybe 10 polygons, basically just a dot in the distance. Then a market stall in the middle of the city is say 100 polygons, and a sign on the hillside with the name of the city on it is 1,000 polygons. Then of course as the player enters the city and approaches the stall it slowly gets more detailed, and if the player turns around and looks back, the sign gets less detailed. Ultimately, once in front of the stall it will be using its full 3,000 polygons, and the sign will just be about 20 polygons off in the distance.
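In code, the core of that is usually nothing fancier than a distance-threshold lookup. A rough sketch (the thresholds, names, and polygon counts are made up to match the market-stall example above):

```cpp
// Toy illustration: distance-threshold LOD pick for the market stall example.
#include <cstdio>

struct LodLevel { float maxDistance; int polygons; };

// Ordered from nearest (most detailed) to farthest (least detailed).
const LodLevel kStallLods[] = {
    {   20.0f, 3000 },  // standing right in front of it
    {  200.0f,  100 },  // somewhere across the city
    { 1.0e9f,    10 },  // far enough away to be basically a dot
};

int polygonsForDistance(float distance) {
    for (const LodLevel& lod : kStallLods)
        if (distance <= lod.maxDistance)
            return lod.polygons;
    return kStallLods[sizeof(kStallLods) / sizeof(kStallLods[0]) - 1].polygons;
}

int main() {
    printf("near: %d polys, far: %d polys\n",
           polygonsForDistance(5.0f), polygonsForDistance(5000.0f));
}
```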


 
Since devs are already using the URE3 in Xbox2 games, when are we gonna see the first PC game use it?
 
There is some unannounced title being worked on. Supposedly, the original tech demo was using assets from that game.

M
 
Watching most of those very same tech demos last year at the GeForce LAN was nothing short of amazing. What I look forward to the most is the improved bump mapping. The bricks of the catacomb wall jumped out at me.

As far as any game "innovation" goes, the more lifelike the screen looks, the more I get pulled in. Sure, the graphics right now are pretty (ie Far Cry + SM3) but to me they still seem "plasticky" or overdone. With better lighting and the character assets I saw in Epic's demo, the overall gaming experience of a game like BF2 or COD, Doom3 even, is gonna reach new heights.

So I'm gonna break out that Special Edition disk from my UT2K4 Special Edition box and try to prepare myself for this evolution. I'm sure I'll still be trying to get my head above water by the time a game actually ships.

Oh and as far as system requirements went... the Unreal Engine 3 demo was running on a 6800U. And with SLI, Dual Core CPUs, these new PPUs (SLI these also), we're in for some really good shit.
 
Lord of Shadows said:
I know about dual cores, it's just that what you said made no sense. You were referring to the number of processes that could be simultaneously executed as threads. And btw, a dual proc won't dedicate a core to just your game; it will timeslice with all the other apps running, like a single core processor.



http://en.wikipedia.org/wiki/Normal_map

Ahhh I see ... so whereas bump maps were limited to creating different surface heights in one direction, normal maps allow you to have surface details in all three directions .... so effectively you can have bump maps on top of bump maps, heading in a different direction.

Well thank you for clearing that up.
 
So if you have a flat surface, the line perpendicular to that surface is the normal. I think that the 3 channels of the normal map designate the x,y,z components of the normal.


Cool stuff!
 
Bump maps - grey-scale images that define the height of a surface without adding any geometry. Only defines one dimension. Object looks flat if seen from any angle other than head on. Doesn't react to lighting.

Normal maps - RGB, each color defines a dimension: X,Y,Z. Object looks great from any angle, doesn't look flat. But still has a faceted profile.

Parallax Mapping - Same values as a normal map, but also displaces the surface of the object visually on a pixel level. So the profile of the mesh doesn't look faceted like it does with a normal map.
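The "visually displaces on a pixel level" part boils down to nudging the texture lookup along the view direction by an amount read from a height map. A rough sketch of basic parallax mapping (shader-style math written out in C++; the names and constants are illustrative):

```cpp
// Toy illustration of basic parallax mapping: shift the texture coordinate
// along the view direction by an amount taken from a height map.
#include <cstdio>

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// uv:     original texture coordinate
// viewTS: view direction in tangent space (unit length, z pointing off the surface)
// height: height-map sample at uv, in [0, 1]
// scale:  artist-tuned strength of the effect
Vec2 parallaxOffset(Vec2 uv, Vec3 viewTS, float height, float scale) {
    float amount = height * scale;
    // Sign convention depends on whether the map stores height or depth.
    return { uv.x + viewTS.x / viewTS.z * amount,
             uv.y + viewTS.y / viewTS.z * amount };
}

int main() {
    Vec2 shifted = parallaxOffset({0.5f, 0.5f}, {0.3f, 0.0f, 0.95f}, 0.6f, 0.04f);
    printf("%f %f\n", shifted.x, shifted.y);
}
```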
 
While the videos do look good, I have to say I do not like how "shiny" everything looks. I call it the "Glossy Plastic Effect." For example look at pics from EQ2 or Elder Scrolls 4 and see how everything looks like it was dipped in a vat of gloss. This takes away from the immersion for me. (EQ2 is really bad about this, the characters look like they are made out of plastic). However I read somewhere that this is just lazy programming and that it is possible to tone down the "shininess" of an object. (I think this was something in the list of things in an UE3 model in a previous post.) F.E.A.R. also suffers from this.

Look at HL2, it had amazing graphics but did not have the "Glossy Plastic Effect." I liked the "dull" look of the run-down city. Hopefully more games will follow this lead.

Also what is the difference between Parallax Mapping and Displacement Mapping? Did they just change the name?
 
MH Knights said:
Also what is the difference between Parallax Mapping and Displacement Mapping? Did they just change the name?

Displacement mapping actually displaces the mesh via a 2D grey-scale image (or any other 2D source, e.g. **procedural** images).
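A minimal sketch of that distinction (my own illustration, not any particular tool's API): displacement mapping really moves each vertex, pushing it along its normal by the sampled height, rather than only faking the relief at shading time.

```cpp
// Toy illustration: displacement mapping moves the vertex itself, along its
// normal, by the grey-scale height sample.
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 displace(Vec3 position, Vec3 normal, float heightSample, float scale) {
    float d = heightSample * scale;
    return { position.x + normal.x * d,
             position.y + normal.y * d,
             position.z + normal.z * d };
}

int main() {
    Vec3 v = displace({0, 0, 0}, {0, 1, 0}, 0.5f, 2.0f);
    printf("%f %f %f\n", v.x, v.y, v.z); // vertex raised 1 unit along its normal
}
```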
 
Hrm, not to piss in the proverbial wheaties, but....

WHY can't they release REAL (not RealPlayer) videos of this? The "cam on a projected screen with some schlub talking in the background" is majorly annoying. Yes, it's cool, and the game will be killer for sure.

/rant

I wonder what this baby is running on....68k Ultra SLI?

edit: Check check, steve, can you turn down the top end, I'm still getting a little ring back here....
 
MH Knights said:
While the videos do look good, I have to say I do not like how "shiny" everything looks. I call it the "Glossy Plastic Effect."
Look at HL2, it had amazing graphics but did not have the "Glossy Plastic Effect." I liked the "dull" look of the run-down city. Hopefully more games will follow this lead.


I am with you on that thinking. I don't like the shiny effect either. Valve did a good job in using it where it should (IMO) be used. Like in CS:S Office. The garage has the shiny/dirty look to it, like a real garage would. It looks as if the garage was painted over numerous times without removing the previous layer of paint... giving it that built-up, too-much-paint look. Just like a garage would be in that context.

It can either take away or add to the realism of a game. Valve (IMO) did a good job of adding to the realism with that particular effect.
 
Valve used environment maps for that effect; unreal3/doom3/thief3 (hey, all threes) shininess is a byproduct of the relatively simple specular highlight system. HL2's lighting system is similar to that of Quake3's. The lighting is prerendered and output to image files (lightmaps) and blended with the map textures. An environment map is pretty much an alpha overlay that has coordinates set differently.
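The prerendered-lightmap part is conceptually just a per-texel multiply of the surface texture by the baked lighting. A toy sketch of that idea (my own illustration, nothing to do with Source's actual code):

```cpp
// Toy illustration: baked lighting is multiplied into the surface texture
// at draw time, per texel.
#include <cstdio>

struct Color { float r, g, b; };

Color applyLightmap(Color albedo, Color baked) {
    return { albedo.r * baked.r, albedo.g * baked.g, albedo.b * baked.b };
}

int main() {
    Color lit = applyLightmap({0.8f, 0.7f, 0.6f}, {0.5f, 0.5f, 0.6f});
    printf("%f %f %f\n", lit.r, lit.g, lit.b);
}
```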

I personally like the effect; it gives a surreal quality to everything, and it just doesn't look like plastic to me.
 
Yea, these videos are crap. And the Intel RoboHords videos on GameSpot, what's up with that ... I want dark gothic slimy flying creatures, not some sort of crappy Zorg GameCube-ness. What a poor example of Unreal Engine 3!

Edit: Noticed that these are not Epic demos, but Intel.
 
Chams said:
Looks amazing I only wish we could get some high res bink videos. :cool:
Yeah... I have the first video in Bink format... but not the other three... anyone know if they exist???
 
I just got an e-mail from some co-workers saying that the demo was running on two 512mb cards in SLI. That sucks... because that's probably about $1400~$1600 in hardware.
 
Well I'll say this, when that engine comes out it's DEFINITELY time to make this my backup rig :) My gpu might be fine, but with all the new physics and most likely improved AI, my cpu will get on its knees and beg for mercy :(
 
hehehehe..peep this,

So when UT2k3 came out (the demo) I was still running a GF3 Ti500... when I got it... it was the shizzy! Soooo, by this time it's past its prime but still usable. So I fire up UT2k3 for some fraggin fun... and about 6 frags into the DM, the frame rate starts stuttering. At first, for like a sec or so, I thought it was just a normal hangup. Then it was like 3 seconds, then 8 seconds of freeze, 2 of movement and so forth. I paused and said... that's weird... and then proceeded to crack open the case. I was greeted with a blast of hot air.

Touch the CPU HSF, fine,
Touch the NB HSF, fine
Put a finger on a ram chip, warm..but fine
Put my finger on the ti500 ram sink OUCH!!! OMG WTF that is teh sux0rs, burned with an instant blister. The card was fried after that... I think a VRM blew and it was off to the albatross (correct me and die) ti4200p turbo, which I still proudly display in my server @ 4600+ speeds.

HEHE
 
Looks cool. It seems like this will be the first game to use SM3.0 a lot... too bad most cards can't handle all the other effects... Hopefully the R520s and the G80s with their (supposed) Dx10 (WMG1.0) support will be able to handle all the effects.
 
Burning water. These are video games. We can do anything. I want to see realistic burning water.

Water on fire > Fire > H2O
 
That is one thing I forgot to mention in my earlier post. I really think that the particle system needs to be overhauled, or at least refined. I am totally bummed that modern graphics engines still use sprites for mist and steam effects.
 
What? You want true volumetrics? It will kill performance. Sprites are great. You can have a fake volumetric that is sliced up on multiple sprites.
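A rough sketch of what "sliced up on multiple sprites" means (my own toy illustration, not any engine's actual particle code): stack several camera-facing quads through the volume and give each one a share of the overall opacity.

```cpp
// Toy illustration: fake a volume of mist with several camera-facing slices,
// each carrying a share of the overall opacity. (Splitting alpha evenly is a
// rough approximation; exact coverage of N slices is 1 - (1 - a)^N.)
#include <cstdio>
#include <vector>

struct Slice { float depth; float alpha; };

std::vector<Slice> buildSlices(float nearDepth, float farDepth, float totalOpacity, int count) {
    std::vector<Slice> slices;
    for (int i = 0; i < count; ++i) {
        float t = (i + 0.5f) / count;  // center of each slab through the volume
        slices.push_back({ nearDepth + t * (farDepth - nearDepth),
                           totalOpacity / count });
    }
    return slices;
}

int main() {
    for (const Slice& s : buildSlices(1.0f, 3.0f, 0.8f, 4))
        printf("depth %.2f alpha %.2f\n", s.depth, s.alpha);
}
```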
 