I'm tired of the PC games market. Console at 30FPS, PC at 60FPS.

Ok, you're a superman who sees the difference between a console and a PC.
I'm a normal person and I don't see the difference between 30FPS on a console and 60FPS on a PC,
but I do see a huge difference between a game that runs at 30FPS on a console and a game that runs at 30FPS on a PC.

Why don't they let us choose the framerate target?

I know there is software like EVGA Precision that caps the framerate, but I don't mean that kind of cap; I mean something like the real framerate target used on consoles.
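For the sake of discussion, here's a minimal sketch of what a simple framerate cap does (this is not EVGA Precision's actual code, just the general idea): it only delays presentation, while the simulation timestep is untouched, which is part of why a cap alone doesn't feel like a console's fixed target.

```cpp
#include <chrono>
#include <thread>

// Sketch of a sleep-based frame cap. The hypothetical update_and_render()
// stands in for whatever the game does each frame; note nothing about the
// game's internal timestep changes, the loop just waits out the budget.
void run_capped(double cap_hz) {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / cap_hz));
    auto next = clock::now();
    while (true /* game running */) {
        // update_and_render();              // hypothetical per-frame work
        next += frame_budget;
        std::this_thread::sleep_until(next); // wait; nothing else changes
    }
}
```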

There's a very well done series of studies that basically show the higher the resolution and the higher the detail of an image, the choppier it appears at a lower frame rate. There's less 'blurring', and our minds associate blurring (especially between frames of video, for example) with 'motion'; when we perceive motion, we perceive fluidity.

If you want your PC game to feel as smooth as a console game... you just need to lower your resolution all the way down to console level, about 1024x768, and slide those detail sliders down to roughly medium. At that point, you will be playing with about the same amount of detail as what a console natively runs at. You might even need to drop to 800x600 to get the pixel count down, though, as pixels are a lot more dense on a 19" monitor than on a 60" screen; see the quick density check below.
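A quick back-of-the-envelope check of that density claim (plain PPI arithmetic, nothing more):

```cpp
#include <cmath>
#include <cstdio>

// Pixels per inch = diagonal pixel count / diagonal size in inches.
double ppi(int w, int h, double diag_inches) {
    return std::sqrt(double(w) * w + double(h) * h) / diag_inches;
}

int main() {
    std::printf("1024x768 on a 19\" monitor: %.1f PPI\n", ppi(1024, 768, 19.0));
    std::printf("1280x720 on a 60\" TV:      %.1f PPI\n", ppi(1280, 720, 60.0));
    // Roughly 67 PPI vs 24 PPI: the same pixel budget is spread much
    // thinner on the TV, so each pixel is physically far larger.
}
```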
 
Amazing troll is amazing. Though I'm not sure why someone with just under 4 years on their account would risk being banned over starting such a ridiculous bait thread.

Honestly, I think he's legitimate in his concern; he's just having trouble articulating what he's seeing. It's quite common that with very high detail, objects appear less smooth. More than likely, he's simply playing at a much higher resolution on his PC compared to his console, so it appears less smooth -- even at equal frame rates. If he lowers the detail on his PC to that of his console, his gameplay and overall experience will suffer (b/c consoles look ugly) but he'll get the smoothness he craves.

Otherwise, I don't know what the OP's point in starting this thread would be, other than to vent. It's not like we can magically make his video card maintain 60 fps... even limiting his fps to 30 via vsync won't truly work, as he's still suffering from the issue of having too much detail on screen at any given time. Hopefully, he'll take my suggestion of putting all those detail sliders on medium and dropping the resolution down to console level (less than 720p).
 
Is there a single reason why they don't let gamers choose to "calibrate the animations" for 30 or 60FPS?
Because there's no such thing. The game ticks; the GPU draws either actual, genuine ticks or interpolated ticks. That's it.
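To make "the game ticks" concrete, here's a minimal sketch of the classic fixed-timestep loop (the function names are illustrative placeholders, not any particular engine's API):

```cpp
#include <chrono>

constexpr double kTickRate = 60.0;            // simulation ticks per second
constexpr double kTickDt   = 1.0 / kTickRate; // seconds per tick

// The simulation advances in fixed ticks regardless of render speed; the
// renderer draws either whole ticks or a blend between the last two states.
void game_loop() {
    using clock = std::chrono::steady_clock;
    auto   previous    = clock::now();
    double accumulator = 0.0;

    while (true /* game running */) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        while (accumulator >= kTickDt) {
            // simulate(kTickDt);   // hypothetical: one genuine tick
            accumulator -= kTickDt;
        }

        double alpha = accumulator / kTickDt; // position between ticks, [0,1)
        // render(alpha);           // hypothetical: draw an interpolated tick
        (void)alpha;
    }
}
```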

There's no great solution for video games to match the motion blur quality of films.
Not necessarily the case anymore. Good real-time motion blur techniques still have some IQ pitfalls, but they're doable performance-wise (one or two milliseconds per frame in some cases). Like you said, though, input responsiveness suffers at much lower frame rates, so it's not the best of trade-offs at the moment.
 
Does frame pacing vary from console to PC, or movies to PC?
I am assuming, based on Unknown-One's post, that frame pacing is just as important as FPS?
If by frame pacing you mean the evenness of frame delivery, yes. Otherwise, I'm not sure what you're trying to ask.
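One way to make "evenness of frame delivery" measurable (the metric choice here is just an illustration, not an industry standard):

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Frame pacing as the standard deviation of frame-to-frame times: two runs
// can share the same average FPS while one of them looks far choppier.
double pacing_jitter_ms(const std::vector<double>& frame_times_ms) {
    if (frame_times_ms.size() < 2) return 0.0;
    double mean = 0.0;
    for (double t : frame_times_ms) mean += t;
    mean /= frame_times_ms.size();
    double var = 0.0;
    for (double t : frame_times_ms) var += (t - mean) * (t - mean);
    return std::sqrt(var / frame_times_ms.size());
}

int main() {
    std::vector<double> even   = {33.3, 33.3, 33.3, 33.3}; // well-paced ~30 FPS
    std::vector<double> uneven = {16.7, 50.0, 16.7, 50.0}; // same average FPS
    std::printf("even: %.2f ms jitter, uneven: %.2f ms jitter\n",
                pacing_jitter_ms(even), pacing_jitter_ms(uneven));
}
```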
 
I figure that most PC games are CONSOLE PORTS ;)
so please, shut up if you don't know :)



I'm not confusing anyone, I'm enlightening kids on how PC animation works.



This is true for most games.
A game that runs really badly at 40FPS is the proof. I'm not talking about stutter, I'm talking about a fixed 30FPS, or a fixed 40FPS.
A game that runs choppy at a fixed 30FPS is the proof of what I'm saying.
Fix the framerate of a game at 30FPS and tell me which games don't run choppy.

This MEANS THAT the game's animations "are calculated for 60FPS".
I'm not saying that 30FPS is comparable to 60FPS; I mean that if a game engine is able to scale its animation down to 30FPS, then 30FPS is enough to play decently.
Since this is not the case, playing PC games at 30FPS is a scandal, not because of the framerate but because the engine doesn't do what it does on a console.

I don't have sources or facts, but I'm sure enough to bet money that most consoles aren't locked at any frame rate... they are limited to a frame rate. Like I said, I don't think GTA5 reaches 30fps even once. Same for GTA4. And Max Payne 3. And Skyrim. Far Cry 3 was choppy as hell and its frame rates were just pathetic. I could go on, and so could everyone else here.
 
Console framerates are horrible under 30. Remember that most LCD TVs are crap and have tons of lag and motion blur compared to any decent PC monitor. My friend used his Dell PC monitor with his PS3 and it was truly horrible; since most games were upscaled by the console, it was even worse.
 
I am assuming, based on Unknown-One's post, that frame pacing is just as important as FPS?

My post had NOTHING to do with frame pacing. I was simply pointing out how incomparable film is to real-time rendering.
 
Do your research. It's an edge-detecting post filter.

FXAA blurs textures. You lose IQ. It's a known fact. Load up some nice ultra/high gravel or dirt texture in Skyrim. Activate FXAA and you get a medium-looking texture that loses all its edges and looks smudged.

Check the Dark Souls 2 console vs PC videos. Those stone walls are amazing! (PC) :cool:
 
How would SSAA blur the screen? FXAA most certainly does blur a little but the level of blurriness can vary.

It creates pixels from a higher-res sample. This creates colors and shades not native to the intended image, which are then perceived as blur.

Higher res screens are the only foolproof option.
 
fxaa is a blur filter. it's a joke.
The intention of FXAA is to use edge-detection to blur only aliased edges.

Hardly a joke, as long as the algorithm works as intended. Removes aliasing and comes with a fairly small performance hit.

Also, the higher the resolution of the source material, the more accurate FXAA becomes. This means that FXAA is great for high-resolution displays (where normal AA modes carry an extreme performance hit).
This also means FXAA can be used to enhance the effectiveness of SSAA (in cases where FXAA is applied before SSAA scales the image down to screen resolution).
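As a rough illustration of the edge-detection idea (a toy luma-based filter, far simpler than the real FXAA 3.11 algorithm):

```cpp
#include <algorithm>
#include <vector>

// Toy sketch of FXAA's core idea: measure local luma contrast and blend
// only where contrast exceeds a threshold. Real FXAA also handles
// sub-pixel aliasing and searches along edge runs; this is just the skeleton.
std::vector<float> toy_edge_blur(const std::vector<float>& luma,
                                 int width, int height,
                                 float edge_threshold = 0.125f) {
    std::vector<float> out = luma; // luma: row-major grayscale, values in [0,1]
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            auto at = [&](int px, int py) { return luma[py * width + px]; };
            float c = at(x, y);
            float n = at(x, y - 1), s = at(x, y + 1);
            float w = at(x - 1, y), e = at(x + 1, y);
            float lo = std::min({c, n, s, w, e});
            float hi = std::max({c, n, s, w, e});
            // Flat regions and low-contrast texture detail pass through
            // untouched; only high-contrast (aliased) edges get blended.
            if (hi - lo > edge_threshold)
                out[y * width + x] = (c + n + s + w + e) / 5.0f;
        }
    }
    return out;
}
```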

FXAA blurs textures. You lose IQ. It's a known fact. Load up some nice ultra/high gravel or dirt texture in Skyrim. Activate FXAA and you get a medium-looking texture that loses all its edges and looks smudged.
That can be avoided by using a higher resolution display (or combining FXAA with low-level SSAA).

The higher resolution the source image, the better FXAA's algorithm works, which avoids mistakes like accidentally blurring textures.

If FXAA is a "blur filter", so is MSAA. And SSAA, for that matter.
MSAA and SSAA are not "blur filters"...

MSAA = Multisample anti-aliasing
Pixels along the edges of polygons are sampled multiple times. This builds up additional data about these pixels that can then be averaged together to generate one hyper-accurate pixel. This will remove aliasing along the edges of objects, but will not help shader aliasing.

SSAA = Supersample anti-aliasing
A render target is created that is higher resolution than your actual display (Usually 2 or 4 times higher resolution). This high-resolution source image is then scaled down to screen resolution, a process which involves averaging groups of pixels into single hyper-accurate pixels. This will remove all forms of aliasing.

Both of the above add MORE data to the final image than running without any form of AA. This is the opposite of blurring, which throws out data. The SSAA resolve step is sketched below.
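A sketch of SSAA's downscale (resolve) step, assuming a simple box filter (actual drivers may use other resolve filters):

```cpp
#include <vector>

// SSAA resolve: the scene was rendered at factor x factor the display
// resolution; average each block of samples into one output pixel. Every
// output pixel is built from multiple samples, i.e. more data, not less.
std::vector<float> ssaa_resolve(const std::vector<float>& src,
                                int width, int height, int factor) {
    std::vector<float> dst(width * height);
    const int src_w = width * factor; // src is (width*factor) x (height*factor)
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            float sum = 0.0f;
            for (int sy = 0; sy < factor; ++sy)
                for (int sx = 0; sx < factor; ++sx)
                    sum += src[(y * factor + sy) * src_w + (x * factor + sx)];
            dst[y * width + x] = sum / float(factor * factor);
        }
    }
    return dst;
}
```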

It creates pixels from a higher-res sample. This creates colors and shades not native to the intended image, which are then perceived as blur.
I'm really not sure what you're talking about here. SSAA creates a hyper-accurate image, not a blurred image :confused:

Not only are colors more accurate (since the color of each pixel is derived from sampling the data that makes up said pixel multiple times), but the position of objects on-screen is more accurate as well.
With no AA, a single pixel moving across the screen will jump from pixel to pixel in full. There's no transition. If the pixel is supposed to be exactly between two physical pixels of the display, it has to be clipped to one or the other.
With SSAA, a single pixel moving across the screen will fade from one pixel to the next. Smooth transition. If a pixel is supposed to be exactly between two physical pixels of the display, both pixels will display 50% of the pixel's color information, as worked through below.
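That 50/50 split falls straight out of box coverage; a tiny worked example (the coverage() helper here is purely for illustration):

```cpp
#include <algorithm>
#include <cstdio>

// Fraction of a one-pixel-wide feature centered at `center` that falls
// inside integer pixel `px` (the interval [px, px+1)).
double coverage(double center, int px) {
    double lo = std::max(center - 0.5, double(px));
    double hi = std::min(center + 0.5, double(px) + 1.0);
    return std::max(0.0, hi - lo);
}

int main() {
    // A feature sitting exactly between pixels 10 and 11 covers each by
    // half, which is the value SSAA's averaged samples converge toward.
    std::printf("pixel 10: %.2f, pixel 11: %.2f\n",
                coverage(11.0, 10), coverage(11.0, 11));
}
```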
 
You are still trying to force it through a screen of finite resolution. This will cause anomalies. That's what I'm talking about.
 
It is really ridiculous.
PC gamers need more than double the power of console gamers to run the same game at the same quality.

The cause of this variance in power consumption can be understood by looking at the gamers' configurations: PC gamers sit in chairs while playing. Console gamers are more reclined and cushioned - couches are very popular with this segment - and hence console gamers expend less power playing the same game.
 