I don't get Lens Flares and Depth of Field

I mean, when you are recording video, you would typically want to avoid getting a lens flare on your recording, right? It's not like the naked eye gets them, so why do so many video games not only add them, but consider them a graphical feature that you can turn on to make the graphics "better"? If it's something you don't experience naturally and would want to avoid when filming, I don't understand why one would deliberately invoke it in a video game, especially with video games shooting more and more for realism nowadays.

As for depth of field: OK, I realize that your eyes can only focus on one thing and everything else, basically your peripheral vision, is blurrier. But that's a limitation of the eye, so why would you want to deliberately invoke it? Especially since with your eyes it's not an issue; when you look at something else your focus changes to that and it's no longer blurry, whereas a video game isn't exactly tracking your eyeballs so it knows to un-blur wherever you are now looking. I mean, at least it makes a little more sense than lens flares, since it's trying to replicate a natural eye limitation, and people say it invokes a sense of depth, but really to me it just feels like it adds unnecessary blur. (Well... and I have a 3D setup if I want depth...)

Most confusing of all, though, would be BOTH. I mean, depth of field is trying to replicate a limitation of the human eye, while lens flares are trying to replicate a (generally unwanted) limitation of a camera lens. How does it make sense to have both?

Normally I try to max out every setting, but I turn both of these off even if my system can easily handle them, because I don't like them. I am not trying to argue against them or anything, just confused why people would actually want these effects in the first place, since they essentially add a camera limitation and blur.
 
The idea is to make the scene look like film, not the human eye. I think it's great for cutscenes, but during gameplay it's more of a "please use it subtly or not at all" effect. A TON of art direction goes into affecting and shaping gameplay. A single light can mean the difference between 9/10 testers completing a mission quickly and 5/10 quitting in frustration. DOF can be used on the rear of your weapon during iron-sight aiming to suggest that you keep your eyes forward. Lens flare can be used to draw more attention to a certain object. But when used excessively, it just ruins the game.

The one thing I wish games would do better is motion blur. One of the main things that can actually make a game look sharper ends up making it look like rubbish. If done correctly, motion blur can actually make your FPS seem higher.

I'm by no means saying motion blur is good in games now. Turn that shit off.
 
Yeah, I tried a racing game with motion blur on and off. My thoughts were "This just makes it harder to see the track and opponents, why would I want this?"
 
Why does DICE insist on trying to blind you with gamma on some maps?

Answer:

They are stuck up little prick artists who think they know what makes a game look pretty, even though they are idiots.
 
Yeah, I tried a racing game with motion blur on and off. My thoughts were "This just makes it harder to see the track and opponents, why would I want this?"

I hear you loud and clear. If you take a super-sharp, action-packed Blu-ray scene and pause it, it looks blurry as shit. The motion blur was filling in the gaps between the frames to make motion easier to follow, so in motion objects seem to glide by smoothly while still looking sharp. Then you look at how games do it, and it does the opposite.
 
I turn off DOF as well, pointless 'feature'.
Lens flare doesn't bother me though, but I agree, it's not needed.
I don't like motion blur unless there are framerate issues, like in Watch Dogs.
 
DOF can make a scene seem more "cinematic", but honestly, it's never done right and it makes the whole game run slower.
 
I hear you loud and clear. If you take a super-sharp, action-packed Blu-ray scene and pause it, it looks blurry as shit. The motion blur was filling in the gaps between the frames to make motion easier to follow, so in motion objects seem to glide by smoothly while still looking sharp. Then you look at how games do it, and it does the opposite.

Watch Dogs and Crysis do it right; you just need to tone it down a bit in Crysis. Subtle, high-sample, per-object motion blur is great.

Depth of field is alright as long as it's used properly, such as during cutscenes where you're focusing on characters talking, or when you're aiming a weapon.
 
Yeah I agree, it's stupid, I almost always turn that crap off.
The one thing I wish games would do better is motion blur. One of the main things that can actually make a game look sharper ends up making it look like rubbish. If done correctly, motion blur can actually make your FPS seem higher.
I haven't met a game that did motion blur in a way that made it feel like the FPS was higher. Motion blur is one of those things I don't get: your eye blurs frames together itself, and your brain is more than capable of discerning the difference between your eye blurring a smooth but fast-moving scene and discrete images that are blurred, especially when you are in control of the movement (movies get away with it slightly more because you aren't in control of the camera movement... but even then it doesn't look great to me).

I remember this coming up when the first Crysis came out, with people lowering their expectations from 30+fps to 20+fps because apparently the motion blur was so good that it felt like higher FPS... it didn't, at least not to me, I think people were just consoling themselves that their rigs couldn't get more than 25fps in the game. When you actually play it at 30+fps you realise how crap 25fps was and that motion blur wasn't sufficient to hide it.

At least that's my opinion; I haven't met the game that had motion blur that made me feel like the FPS was higher than it actually is. If anything, it highlights framerate drops to me, because I can clearly see the discrete blurred images and they look ugly as hell.
 
I hate lens flare in games and always disable it when possible.

As to why they're there: because they're trying to mimic how you have experienced things like that before, via cinematic media like movies.
 
your eye blurs frames together itself

nooooooooooooooooooooooooo

You need either a near-infinite framerate or high-quality motion blur to make video games look like real life, because real life is a constant stream of information; games are not, and neither are films or anything else played back on a computer. There is no information in between frames. Films get around this by increasing the shutter length so each frame captures the motion that happened during its exposure, blurring adjacent moments together and giving the illusion of smooth movement (fake information). If you used an extremely short shutter length to film something at 24 fps, it would look insanely choppy and terrible, because it's missing a lot of information that our eyes would normally see, just like a video game does. This is why high-quality, proper motion blur is needed in games, because we're not at the point yet where we can do it the best possible way (an extremely high framerate source on an extremely high refresh rate display).
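
To put rough numbers on the shutter point (this is just my own arithmetic, not anything from the post above), here's a quick sketch of how much motion a film frame actually captures at different shutter angles:

```python
# Rough arithmetic on shutter angle vs. per-frame exposure (illustrative only).
# A film camera's shutter angle controls what fraction of each frame interval
# is actually exposed; a 180-degree shutter exposes for half the frame time.

def exposure_ms(fps: float, shutter_angle_deg: float) -> float:
    """Milliseconds of motion captured (smeared) into each frame."""
    frame_time_ms = 1000.0 / fps
    return frame_time_ms * (shutter_angle_deg / 360.0)

for angle in (360, 180, 45):
    print(f"24 fps, {angle:>3} deg shutter -> {exposure_ms(24, angle):.1f} ms of motion per frame")

# A 180-degree shutter at 24 fps smears ~20.8 ms of movement into each frame,
# which is why film motion reads as smooth; a 45-degree shutter captures only
# ~5.2 ms, leaving big gaps between what adjacent frames show -- the choppy
# look described above, and roughly what an unblurred game frame looks like.
```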
 
I actually prefer lens flare in some games. I thought Crysis 3 did it perfectly. At first I turned it off because I was trying to save a few frames, but it looked a lot better with it on. I'm kind of a sucker for eye candy though, so little "gimmicks" like that are up my alley.
 
I have motion blur in BF4 set to 20%. At that level it's about like the motion blur in Half Life 2, which I thought was a tasteful level of motion blur.

One of the best examples of Depth of Field I've seen was in Super Mario 3D World. It added to the sense of scale for some reason.
 
nooooooooooooooooooooooooo

Sorry I wasn't clear, I didn't mean computer-generated frames so much as the real-life images that are the result of light hitting the retina. I didn't mean your eye blurs ugly-arse 24fps images together... I mean your brain can tell the difference between a smooth, constant stream of fast-moving visual information that you interpret as real-life motion blur and crappy low-FPS computer-generated motion blur.
Films get around this by increasing the shutter length so each frame captures the motion that happened during its exposure, blurring adjacent moments together and giving the illusion of smooth movement (fake information). If you used an extremely short shutter length to film something at 24 fps, it would look insanely choppy and terrible, because it's missing a lot of information that our eyes would normally see
Err, films at 24fps DO look insanely choppy when things are moving fast, despite the shutter speed. You see it in jerky panning shots, and when action is fast you just see the discrete blurred images (at least I do, and I don't think I have some sort of fast-seeing super power).

It's worse in games because you actually have control over the camera, so when you pan, you know what you SHOULD be seeing... and you don't. Also I don't know about you, but I tend to move my mouse significantly faster and in more precise yet weaving patterns than a movie camera will typically do (unless the movie is intentionally trying to make you see nothing by moving the camera too fast).

I'm no doctor, but I think the reason is that real-life motion blur works by having a continuous stream of information hitting the retina, where it persists for around a tenth of a second, so you have a tenth of a second's worth of information smeared across your eye, with things that happened 0.1 seconds ago appearing duller than things that happened 0.01 seconds ago. It's all a continuous stream of data. When you take something with a finite framerate and try to apply motion blur, instead of having a constant stream of 0.1s of data on the retina, you have discrete image imprints that just happen to be blurred within themselves... it does not look the same.

At least that's my understanding; maybe it goes deeper than just how the retina works. All I know is I have yet to meet a game where I thought motion blur made low FPS not feel like low FPS. I remember going through this whole discussion when Crysis 1 came out and people were trying to tell me how 25fps is fine because of the motion blur, and I tried a plethora of settings and it always still looked like shit to me. Regardless of the motion blur settings it still always felt like 25fps, and in the end I was most content with it off or just set very low. Motion blur isn't the first thing I turn off in a game, but it is definitely high on my list of things I turn off.

EDIT: Updated several things.
 
Personally, lens flares don't bother me too much, but that's probably because most of the time I do see them, they aren't obstructing anything.

Depth of field makes a game look amazing when done right, but I hate having it set extremely high. It kills the eyes.
 
I think in a 'game' situation, motion blur will never work miracles. Take for instance: if your game is running at 60FPS with 1/60th of a second of motion blur applied, it should look amazing. But if you are running at 25FPS and have 1/25th of a second of motion blur, it will look messed up from the player's perspective. Non-interactive stuff can still benefit, but you nailed it when you said that 25 FPS is horrible, blur or not, when the player controls the camera.
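
To put a back-of-the-envelope number on that (my own figures, not from the post above), here's roughly how far a camera pan smears across the screen when the blur covers a full frame interval:

```python
# Back-of-the-envelope: how far a camera pan smears across the screen during
# one frame's worth of motion blur. All numbers here are purely illustrative.

def smear_pixels(pan_deg_per_s: float, fov_deg: float, screen_px: int, fps: float) -> float:
    """Approximate on-screen smear length if the blur covers a full frame interval."""
    degrees_per_frame = pan_deg_per_s / fps
    return degrees_per_frame * (screen_px / fov_deg)  # crude linear approximation

for fps in (60, 25):
    px = smear_pixels(pan_deg_per_s=180, fov_deg=90, screen_px=1920, fps=fps)
    print(f"{fps} fps -> roughly {px:.0f} px of smear per frame")

# At 60 fps the smear is short enough to read as smooth motion; at 25 fps the
# same pan smears almost two and a half times as far per frame, which is why
# heavy blur at low framerates looks like streaking rather than smoothness.
```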
 
Why does DICE insist on trying to blind you with gamma on some maps?

Answer:

They are stuck up little prick artists who think they know what makes a game look pretty, even though they are idiots.

QFT.
 
I think in a 'game' situation, motion blur will never work miracles. Take for instance: if your game is running at 60FPS with 1/60th of a second of motion blur applied, it should look amazing. But if you are running at 25FPS and have 1/25th of a second of motion blur, it will look messed up from the player's perspective. Non-interactive stuff can still benefit, but you nailed it when you said that 25 FPS is horrible, blur or not, when the player controls the camera.

Doesn't take a genius to figure that out. 25 fps with proper motion blur is still better than 25 fps without, and that's a fact.
 
Lens flare is a natural result of light passing through glass with the right flaws at the right angle. When I see one in real life, I know it's time to wash my glasses. :p

It makes sense to have lens flares in games when you're looking through glass, like in a vehicle or if you're wearing a helmet, etc. But yeah, I'm pretty sure you can't ever see lens flares with the naked eye, and if you do it's time to go see a doctor. So yeah, modern games are using them way too much.

Same with depth of field. Modern cameras can render beautiful bokeh (the blurry, out-of-focus areas of a shot) that adds artistic abstraction to photos and movies. Games try to mimic this so they look more artistic, photo-realistic, and cinematic. It definitely makes sense to use depth-of-field effects in cutscenes, and it almost even makes sense when you're looking through a scope or something else with a very specific focus. But when it starts to obscure vision during gameplay and make the game harder to play, they're definitely doing it wrong. Depth of field shouldn't be something that just gets used to hide lazy design and texturing in the middle of the game.
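
For anyone curious what a camera-style DoF is actually computing, here's a minimal sketch of the thin-lens circle-of-confusion formula that this kind of bokeh approximates (the lens values below are made up for illustration):

```python
# Minimal sketch of the thin-lens circle-of-confusion (CoC) formula that
# camera-style depth-of-field effects approximate. Values are illustrative.

def coc_diameter_mm(subject_m: float, focus_m: float, focal_mm: float, f_number: float) -> float:
    """Blur circle diameter on the sensor for a subject at subject_m metres
    when the lens is focused at focus_m metres."""
    f = focal_mm
    s1 = focus_m * 1000.0    # focus distance in mm
    s2 = subject_m * 1000.0  # subject distance in mm
    return abs(s2 - s1) / s2 * (f * f) / (f_number * (s1 - f))

# A 50mm lens at f/1.8 focused on a character 2 m away:
for d in (2.0, 4.0, 20.0):
    print(f"subject at {d:>4} m -> CoC {coc_diameter_mm(d, 2.0, 50.0, 1.8):.3f} mm")

# Anything whose CoC exceeds the sensor's acceptable sharpness threshold
# (roughly 0.03 mm on a full-frame sensor) renders as visible bokeh; the
# further from the focus plane, the larger the blur disc a DoF pass draws.
```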
 
It's odd to me that people would complain about lens flares not being physically correct while happily playing games that have completely incorrect lighting, with surfaces that have an order of magnitude more specularity than they exhibit in the real world (and this is somehow a wonderful thing) and with shadows that have completely consistent hardness across the entire scene.

All the while there's some sort of HUD that's being projected onto the screen in a way that either doesn't exist or doesn't make sense, they're running three times faster than the very quickest Kenyan sprinters and they're absorbing forty large-caliber rounds before they even consider keeling over.

Somehow, though, the implausibility of lens flares is a key sticking point.

Why does DICE insist on trying to blind you with gamma on some maps? They are stuck up little prick artists who think they know what makes a game look pretty, even though they are idiots.
Those 'stuck up little prick artists' develop games that move millions of units annually. Clearly, consumers are indicating that they absolutely want what those artists are doing. I suspect, despite your complaints, that you're among them.
 
I agree, most times these don't add much to the game.

In the case of motion blur I really don't see the point at all; while most of us are doing what we can to get the least amount of motion blur possible, adding it as a gfx option seems silly.

DOF, I think, has some moments where it does add to the cinematic effect, but for every time it adds to the game, I find just as many times it takes away from it. I think with some work and some subtlety this can be good.

Lens flare, meh. I can take it or leave it.
 
Recently played through Metro Last Light - thought the lens flare in that game was amazing. The contrast it provides against the relatively dark backgrounds, and how it shifts in appearance depending on your viewing angle, made it very immersive.

On DoF, I tend to think that dynamic DoF looks awkward - the game doesn't really know where you're looking. In the case of iron sights it works out okay, but any implementation that assumes you're looking exactly at the center of your view feels strange. Cutscenes can be a mixed bag.

When modding Skyrim, instead of using dynamic DoF I used a fixed DoF. Set the range so that it blurs the peaks of mountains and distant trees, hiding the low-resolution stand-ins and giving a much softer feel and a greater sense of scale to the scene. It also allows your eye to focus on relevant, closer objects more easily.
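
If anyone wants to picture what a fixed DoF like that boils down to, here's a rough sketch of the idea (the function name and distances are hypothetical, not an actual Skyrim or ENB setting):

```python
import numpy as np

# Rough sketch of a fixed-range depth-of-field blend: everything nearer than
# blur_start stays sharp, everything past blur_end gets the fully blurred copy,
# with a smooth ramp in between. Names and distances are hypothetical.

def fixed_dof(sharp: np.ndarray, blurred: np.ndarray, depth_m: np.ndarray,
              blur_start: float = 800.0, blur_end: float = 2000.0) -> np.ndarray:
    """Blend a sharp frame (H, W, 3) with a pre-blurred copy based only on distance."""
    t = np.clip((depth_m - blur_start) / (blur_end - blur_start), 0.0, 1.0)
    t = t * t * (3.0 - 2.0 * t)  # smoothstep falloff
    return sharp * (1.0 - t[..., None]) + blurred * t[..., None]

# Because the blur range is fixed, distant low-detail geometry (mountain peaks,
# tree LODs) is always softened, while nearby objects never get blurred no
# matter where the crosshair happens to be pointing.
```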
 
I don't get lens flares when I look at the sun in real life, so why should I get lens flares when I look at the sun in a game? Unless my character is wearing goggles or a face mask.
 
I don't mind flares in a single-player game, but depth of field is one of my biggest complaints about games in recent years. I hate this movie crap in my games.
 
I don't get lens flares when I look at the sun in real life, so why should I get lens flares when I look at the sun in a game?

I know!! I usually get a solid bright light, then the dark of my eyelids with little sparkly bits, and then a slight burning sensation accompanied by a momentary head pain. Why aren't they simulating that?!?!?! :eek:

:D

As far as the topic goes, I hate blurring in games. I can suspend disbelief well enough to enjoy crisp details while still being immersed. Blurring is a performance hit, it's not typically well executed, and most games aren't realistic enough as it is to warrant that extra bit when the performance could be spent on better poly counts, more interesting FX shaders, etc.

Lens flares... either way is fine. I get what they're trying to do, and while it may be counter-intuitive realism-wise, I don't really care. It can add some cinematic effect, or it can look really stupid. I just tend to get used to it either way.
 
Those 'stuck up little prick artists' develop games that move millions of units annually. Clearly, consumers are indicating that they absolutely want what those artists are doing. I suspect, despite your complaints, that you're among them.

Those artists also wanted the horrible blue tint in BF3. They don't sell more copies because of those crappy art decisions; they sell from marketing and gameplay. Those art decisions are what make the games lose long-term players.
 
Doesn't take a genius to figure that out. 25 fps with proper motion blur is still better than 25 fps without, and that's a fact.
This ^

Unfortunately, a computer would have to render a very large number of frames and average them together to produce a single frame with accurate motion blur. This also means you have to wait until all of your samples render before displaying the final multi-sampled frame (which could result in input lag).

Edit: Here's exactly what I'm talking about, done in post-production. Footage was captured at 60 FPS, and all of those frames were used to generate 24 FPS footage with multiple samples per-frame: https://www.youtube.com/watch?v=GpwoIyOgoJU

I wonder... could AMD / Nvidia implement multi-frame sampling at the driver level? So that any time the current FPS is higher than the monitor refresh rate, all those extra frames are averaged together and displayed as a single frame instead of being discarded (or only partially displayed)?
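
For what it's worth, the averaging itself is simple; a minimal sketch of that kind of accumulation (assuming an integer ratio between source and output framerates, which the 60-to-24 example above doesn't quite have) could look like this:

```python
import numpy as np

# Minimal sketch of multi-frame accumulation motion blur: average groups of
# high-rate source frames into single output frames. Assumes an integer ratio
# between source and output framerates (e.g. 120 fps in, 24 fps out).

def accumulate(frames: list[np.ndarray], samples_per_output: int) -> list[np.ndarray]:
    """Collapse every group of samples_per_output frames into their mean."""
    out = []
    for i in range(0, len(frames) - samples_per_output + 1, samples_per_output):
        group = np.stack(frames[i:i + samples_per_output]).astype(np.float32)
        out.append(group.mean(axis=0).astype(frames[0].dtype))
    return out

# e.g. 120 fps capture -> 24 fps output with 5 samples blended into each frame:
#   blurred_24 = accumulate(frames_120, samples_per_output=5)
# The catch is exactly the one mentioned above: every sample has to exist before
# the blended frame can be shown, so doing this live adds latency.
```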
 