Anyone else tired of "next gen" graphics?

no, deferred rendering looks no different at all.
it CAN make it look different (ie lots and lots of dynamic lights for cheap!) but there is no side effect that makes it look different by default.


the bloom in BC2 is just that, stupidly overdone bloom. nothing more.
the color scheme in the game is pretty boring and on some maps you'd actually be hard pressed to spot someone just because it's THAT BRIGHT.

http://screenshot.xfire.com/s/94578894-4.jpg bloom off
http://screenshot.xfire.com/s/94729369-4.jpg bloom on

DUMB either way, it's just bad art direction.


i think Crysis actually did a really good job. it had a ton of the usual suspects but managed to never overdo anything; it looked fucking good instead of washed-out shit or blurry/greasy whatever.

Have you adjusted the brightness in the game? If you turn it down a ways, it doesn't look like that.
 
no, deferred rendering looks no different at all.
it CAN make it look different (ie lots and lots of dynamic lights for cheap!) but there is no side effect that makes it look different by default.
Yes, it's just a way to render certain effects (mainly shadows) very quickly.
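To make the "lots of dynamic lights for cheap" point concrete, here's an illustrative Python sketch of the deferred idea: geometry is rasterised once into a G-buffer, then lighting is a per-pixel loop whose cost doesn't grow with scene complexity. The `GBufferPixel` name and the simple Lambert shading are made up for illustration, not taken from any engine.

```python
# Illustrative sketch of deferred shading: pay for geometry once (pass 1),
# then lighting cost is just pixels x lights (pass 2). In forward rendering,
# geometry cost multiplies with the light count instead.

from dataclasses import dataclass

@dataclass
class GBufferPixel:
    albedo: tuple    # surface colour, written during the geometry pass
    normal: tuple    # surface normal, also from the geometry pass
    position: tuple  # world position (unused in this minimal sketch)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade_deferred(gbuffer, lights):
    """Pass 2: accumulate simple Lambert lighting per pixel, per light."""
    out = []
    for px in gbuffer:
        color = (0.0, 0.0, 0.0)
        for light in lights:
            ndotl = max(0.0, dot(px.normal, light["dir"]))
            color = tuple(c + a * ndotl * i
                          for c, a, i in zip(color, px.albedo, light["intensity"]))
        out.append(color)
    return out
```

Adding a tenth light here just adds one more iteration of the inner loop per pixel, which is why deferred engines can throw dozens of small dynamic lights around cheaply.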
 
Played God of War 3? I never realized skin was so glossy!

If someone is constantly running (and that should be enough to answer this question), slicing people up, ripping wings off creatures, and countless other physical actions, would you expect that person to be sweaty? How would you represent someone covered in sweat? Sometimes you people make me wonder. :rolleyes:
 
the reason why it's all shiny and blurry these days is because it's the only way to cover up bad graphics on consoles...

console graphics are horrible... it's like playing a 2003 PC game, or worse, TBH.

You're an idiot.
 
You're an idiot.

Well, says the guy with nothing but consoles in his sig, so biased much? :rolleyes:

Yes, I have just a PC in my rig but I was a console gamer, I still may go back to it later on. I'll wait for the next-gen and see.
 
I'm so glad someone brought this up, unfortunately, all my friends are consoletards that think every new game has awesome graphics:rolleyes:
 
Well, says the guy with nothing but consoles in his sig, so biased much? :rolleyes:

Yes, I have just a PC in my rig but I was a console gamer, I still may go back to it later on. I'll wait for the next-gen and see.

I have built 4 PC's in my life-time, I currently own 1 PC and 1 Laptop. ;)
 
Don't forget that the original Bad Company was a console exclusive, and made heavy use of postfiltering/bloom/other shader effects.

I imagine that their goal was to improve the graphics without changing the overall look and feel of the game. The result is BC2, which actually looks quite nice on the PC but really overdoes it on the bloom/postfilter effects.

Once I beat the single player, I'll try mucking around in the .ini files and see if I can't make the game look better. Removing all of that bloom could probably give you an edge in online play!
 
Sometimes I miss having sprite graphics, like the SNES version of Mario Kart, rather than all this 3D stuff.
 
My only question in regards to next-gen graphics is who the hell came up with the idea to make characters' skin look like it is covered in plastic wrap.
 
I know exactly what you mean. This is the deferred shading engine effect in use. Most of these "next gen" games are like the Unreal 3 engine, etc, and they are all hazy, blurry, have way too much exposure, bloom, HDR, and no sharpness. Everything is either way too dark or way too overlit. Nothing has sharpness really.

That's one reason why Modern Warfare is still a decent game: it's a heavily modified Quake engine and still has that sharpness and balance of lighting and colors to it.

Dirt 2 looks like crap, Rainbow Six Vegas 2 looks like crap, tons of console games look like crap.
Look @ the Crysis 2 screenshot thread. It looks like crap. Too much yellow lighting, blurriness, etc., which looks like overcompensation for low polys and low-res textures.

http://hardforum.com/showthread.php?t=1510193

Bad Company 2 uses deferred shading as well, I believe, but it's not as bad as the OP's screenshot. I do find the desert levels are far too overexposed and the lighting is too bright.

I guess in the sense that a car is based on the wheel, MW uses a heavily modified Quake engine.
 
It's just hit me why I love the AvP singleplayer campaigns - they're oldschool.
The characters are very detailed, the animations awesome and the graphics cool, but the presentation of it feels a little like the first Half-Life game to me
 
Bad Company 2 is attempting to simulate real combat effects. When a building blows up, you know there may be debris and dust that will affect your vision. I just don't understand gamers. Longing for more unrealistic games?? Makes no sense and comes off as complaining to complain. If DICE had made BC2 on the same engine as BF2 with tweaks, there would be crying on this very forum to high heaven. For the record, when ish blows up there will be dust to impair your vision. In the desert, many times the sun will impair your vision too. So in short, what this thread teaches us is that PC gamers, who always pride themselves on buying the latest and greatest video cards year in and year out and clown console fanbois for their underpowered gaming machine of choice, would like devs to tone down the graphics?

I didn't read the entire thread so someone set me straight if I'm off base here.
 
Bad Company 2 is attempting to simulate real combat effects. When a building blows up, you know there may be debris and dust that will affect your vision. I just don't understand gamers. Longing for more unrealistic games?? Makes no sense and comes off as complaining to complain. If DICE had made BC2 on the same engine as BF2 with tweaks, there would be crying on this very forum to high heaven. For the record, when ish blows up there will be dust to impair your vision. In the desert, many times the sun will impair your vision too. So in short, what this thread teaches us is that PC gamers, who always pride themselves on buying the latest and greatest video cards year in and year out and clown console fanbois for their underpowered gaming machine of choice, would like devs to tone down the graphics?

I didn't read the entire thread so someone set me straight if I'm off base here.

Umm, you're way off the mark, maybe you shoulda read the thread before replying :p

What we're talking about isn't dust when something explodes, which is cool.

We're talking about things like bloom, HDR and overdone lighting. ;) Which make a game look very unrealistic.
 
Sorry, I couldn't hear you over the Bloom, lens flares, and extreme bump mapping. I'll come back later and make a reasonable post.

tl;dr for now.
 
Umm, you're way off the mark, maybe you shoulda read the thread before replying :p

What we're talking about isn't dust when something explodes, which is cool.

We're talking about things like bloom, HDR and overdone lighting. ;) Which make a game look very unrealistic.

Yeah, I actually did understand; I just disagree when it comes to BC2. I think the lighting is done well. To each his own, I guess.
 
Umm, you're way off the mark, maybe you shoulda read the thread before replying :p

What we're talking about isn't dust when something explodes, which is cool.

We're talking about things like bloom, HDR and overdone lighting. ;) Which make a game look very unrealistic.

What's wrong with this whole topic isn't that there is anything wrong with bloom, HDR, and "overdone lighting"; it just comes down to poor execution. Don't blame the effect, blame the developer. Just because company A can't use bloom properly doesn't mean company B can't either.

There is nothing wrong with bloom. Bloom isn't just some made-up feature 'to cover up bad graphics' as some of you are stating.

Here is a real life picture which includes bloom.
[image: HrdiBloomExample.jpg]


In image processing, computer graphics, and photography, high dynamic range imaging (HDRI or just HDR) is a set of techniques that allow a greater dynamic range of luminances between the lightest and darkest areas of an image than standard digital imaging techniques or photographic methods. This wider dynamic range allows HDR images to more accurately represent the wide range of intensity levels found in real scenes, ranging from direct sunlight to faint starlight.[1]

Some of you are also saying game devs "created HDR", or bloom, to hide bad graphics. Based on this definition from wiki, I don't see how you use "HDR" to "hide" anything. All it is doing is allowing a greater dynamic range of luminance between light and dark areas, which really would help bring out more detail in areas that would naturally be too dark; quite the opposite of "hiding".

As for "overdone lighting": games are different, and some devs are trying to set certain moods with lighting.

What it comes down to is you can't please everyone. You are going to have people who like the way their old games look and want their new games to look and feel like what they've been playing, and then you have people who want their games to look more photorealistic.

On a side note, the reason I called that one guy an idiot is that anyone stating games from the current gen look like games from 2003 is an idiot.
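For anyone curious what "bloom" actually is mechanically: it's typically a bright-pass threshold followed by a blur, added back over the frame. Here's an illustrative Python sketch on a 1-D row of luminance values; the threshold, box-blur kernel, and strength numbers are made-up defaults, not taken from any particular game.

```python
# Illustrative sketch of a typical bloom pass: keep only the bright parts,
# blur them, and add the glow back over the original image.

def bright_pass(pixels, threshold=0.8):
    """Keep only the energy above the threshold; everything else goes black."""
    return [max(0.0, p - threshold) for p in pixels]

def box_blur(pixels, radius=1):
    """Cheap box blur standing in for the usual Gaussian."""
    n = len(pixels)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(pixels[lo:hi]) / (hi - lo))
    return out

def bloom(pixels, strength=0.5):
    """Add blurred highlights back; overdone bloom is just a low threshold
    and a large strength, which is when detail starts to drown in glow."""
    glow = box_blur(bright_pass(pixels))
    return [min(1.0, p + strength * g) for p, g in zip(pixels, glow)]
```

With a sane threshold, a mid-grey scene passes through untouched and only genuine highlights glow; crank the strength or drop the threshold and you get the washed-out look this thread is complaining about.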
 
The issue with most HDR implementations is that they lack a decent adaptive/dynamic tone mapping system. Without tone mapping, HDR rendering isn't entirely useless, but it's realistically just painting the world in a constant layer of blooms, and that really does defeat the purpose of HDR rendering in the first place. Oblivion, for instance, has dynamic tone mapping, but the eye adaptation rate is instantaneous and the whole thing falls apart because of that.

Bloom can be used as a relatively cheap effect to "soften up" a scene, but developers tend to take it too far. Combined with excessive post-process color desaturation, images become too soft and lacking in detail to be pleasing.

But, yes, excessive post-processing does help mask otherwise bland rendering. It achieves a "next gen" type appeal without actually being appealing for people who really know what they're looking at, and it's used because the performance hit of bloom and color correction is relatively inexpensive.
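The adaptive tone mapping described above can be sketched in a few lines: exposure drifts toward the scene's average luminance over time instead of snapping instantly (the instantaneous adaptation is exactly the Oblivion complaint). This is an illustrative Python sketch using the classic Reinhard operator; the mid-grey target of 0.18 and the adaptation-rate constant are conventional guesses, not values from any engine.

```python
# Illustrative sketch of dynamic tone mapping with gradual eye adaptation.

import math

def adapt_exposure(current, scene_avg_lum, dt, rate=1.5):
    """Exponentially approach the target exposure; higher rate = faster eyes.
    If adaptation were instantaneous (rate -> infinity), the image would
    re-expose every frame and the effect falls apart, as in Oblivion."""
    target = 0.18 / max(scene_avg_lum, 1e-4)  # aim mid-grey at the scene average
    return current + (target - current) * (1.0 - math.exp(-rate * dt))

def reinhard(lum, exposure):
    """Classic Reinhard operator: compress [0, inf) luminance into [0, 1)."""
    l = lum * exposure
    return l / (1.0 + l)
```

Walking from a cave into sunlight, `scene_avg_lum` jumps, the target exposure drops, and `adapt_exposure` eases the image down from blown-out to readable over a second or so instead of instantly.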
 
one small thing though, if you were to look at that window in real life, your eyes would adjust and it wouldn't be that "glarey"... the effects of a particular camera don't necessarily mean it's realistic...

otherwise i generally agree, especially about the dude talking about 2003... :p
 
What's wrong with this whole topic isn't that there is anything wrong with bloom, HDR, and "overdone lighting"; it just comes down to poor execution. Don't blame the effect, blame the developer. Just because company A can't use bloom properly doesn't mean company B can't either.

There is nothing wrong with bloom. Bloom isn't some just made up feature 'to cover up bad graphics' as some of you are stating.

Here is a real life picture which includes bloom.
[image: HrdiBloomExample.jpg]
Some of you are also saying game dev's "created HDR", or Bloom to hide bad graphics. Based off this definition from wiki, I don't see how you use "HDR" to "hide" anything. All it is doing is allowing a greater dynamic range of luminance between light and dark areas. Which really would help bring out more detail in areas that would naturally be to dark, quite the opposite of "hiding".

As for "over done lighting". Games are different, some dev's are trying to set certain moods with lighting.

What it comes down to is you can't please everyone. You are going to have people who like the way their old games look and want their new games to look and feel like what they've been playing, and then you have people who want their games to look more photorealistic.

On a side note, the reason I called that one guy an idiot, because anyone stating games from the current gen look like games from 2003, is an idiot.

Of course it comes down to personal preference. Anything I say is my opinion on what I like and would prefer in games. Bloom and HDR could look great if they were done properly, but they're not; devs definitely use them to cover bad graphics, bad textures, bad shadows, bad lighting. You don't notice them because everything is bloody glowing constantly (or alternatively too dark to see properly). Correctly applied HDR and bloom weren't originally created for this purpose, but they're definitely used for it.

Devs use them to give the "wow" factor... unfortunately the wow factor for me and many others wore off years ago, and now we'd like some realistic textures, realistic shadows, realistic lighting. For me the wow factor wore off by the time I was halfway through Oblivion; that was the first "current" gen game I played (on PC though), and when I first turned on HDR I was really impressed. By halfway through the game it was getting on my nerves how everything was glowing, and I turned it off again.

On my wall I have a poster of a car that I helped build, and a friend photographed it then photoshopped it to make it "look good" similar to how game devs use lighting, bloom and HDR. It blew me away when I first saw it, but after a while of it hanging on my wall the special effects wore off and now I'd rather just have a basic, good quality photo of the car racing on the track, instead of the filtered thing which hides the car behind a veil of hyper-realistic lighting.

Of course these effects can be used to create a more realistic experience; of course they can be used to set the mood and feel of an environment. However, far more often than not they are just used to create an overexposed image which hides otherwise bad lighting, models and textures. These effects are just overused to the point where all games these days look the same: overexposed.

One game I played recently that exemplifies this is NFS: Shift. Why in the heck does a car racing game need to look so horribly overexposed and washed out? Then I turned off the effects and realised that behind it all it was similar quality to any of the old racing sims I've played (though admittedly the cockpits were still nice ;)).
 
one small thing though, if you were to look at that window in real life, your eyes would adjust and it wouldn't be that "glarey"... the effects of the particular camera doesn't necessarily mean it's realistic...

otherwise i generally agree, especially about the dude talking about 2003... :p

It would appear to be bright or "glary" before your eyes adjust. That's how bloom should work: walking from a dark place to a bright place, your pupils are still enlarged, letting more light in; then they adjust and shrink, taking away the "bloom" effect.
 
one small thing though, if you were to look at that window in real life, your eyes would adjust and it wouldn't be that "glarey"... the effects of the particular camera doesn't necessarily mean it's realistic...
That's dynamic tone mapping. Exposure control, essentially, based on a basic interpretation of the scene. Enter a dark room and the exposure adjusts so you can make out detail in shadows. Exit into daylight and the exposure slowly adjusts back, the blooms become less obvious, and the scene is no longer blown out.
 
LOL, guys... I know what it is, that's why I pointed it out :p

my point was, and perhaps I wasn't clear enough, that you can't just go on photos and videos of the real world, because they don't necessarily adjust like our eyes do, and maybe that is part of the reason games are sometimes too dark or too bright. I realize they have been trying to make it auto-adjust since, like, what, HL2(?), I don't remember...
 
No kidding! I'm on board with everyone here. I remember when I used to play Delta Force 2. That game was the shit. Wide open huge MP maps, everything depended on your skill. There were no stupid effects. Now everything is like a damn movie.

Go play Arma 2, the last game to have wide view distances and none of that blur and bloom shit.
 
I actually don't think Bloom and HDR are to blame by themselves. The real issue is how designers are using them. IMO Half-Life 2 is still one of the more realistic looking games (at high res) and it uses those effects properly.
It seems like when the 360 and PS3 hit everyone went crazy with bloom effects and that's what the OP is referring to. When abused, HDR/Bloom effects are torture.
 
overdone bloom and overbright lighting are there to cover up shit textures and low-poly models to reduce load times on consoles. get a console-ported game on PC and disable advanced lighting and bloom and all that jazz and you'll see how bad the textures and models are, straight from 1999.
 
In many cases games need this next-gen lighting to show off all the bumps and details; with regular "classic" lighting you really can't make them out, so everything looks kinda flat.

So when everything looks like crap when you turn off the HDR (or other) lighting, it's because you're actually taking away the shadows/refraction that give objects more detail.

Look at a wet rock in any new game, then turn off the special lighting; the rock will sometimes (depending on the technique) look bland.
 