I'm tired of the PC games market. Console at 30FPS, PC at 60FPS.


sblantipodi

It is really ridiculous.
PC gamers need more than double the power of console gamers to run the same game at the same quality.

Everyone talks about optimization; OK, that's one factor that influences performance, but what about the framerate?
If I watch a console game at 30FPS it runs flawlessly; if I watch a PC game at 30FPS it runs like crap, because PC games are tuned for 60FPS.

Is there a single reason why they don't let gamers choose whether to calibrate the animations for 30 or 60FPS?
I really don't see a difference between a 30FPS console game and a 60FPS PC game, so why don't they let us choose to run games at 30FPS?
 
I feel like you're trolling.

A lot of the reason MOST people find 30FPS on consoles passable is that they are already playing on huge TVs that have input lag out of the gate, using a controller, and are generally "casual" in nature.

The folks who play on PC are using monitors with little to no input lag and a mouse and keyboard, and generally care more about, or are more in tune with, the games they play and how they run.

I bought another PS3 recently for GTA5 and The Last of Us, and both of those games were unenjoyable to almost unplayable for me because of the frame rate and perceived slowness. I literally felt like I was playing at 15FPS at times in both of those games (and probably was).

If you play a PC game with an Xbox controller, 30FPS or things like v-sync lag are less noticeable than they are with a KB/M.
 

OK, you are a superman who sees the difference between a console and a PC.
I'm a normal person and I don't see the difference between 30FPS on console and 60FPS on PC,
but I do see a huge difference between a game that runs at 30FPS on a console and a game that runs at 30FPS on a PC.

Why don't they let us choose the framerate target?

I know there is software like EVGA Precision that caps the framerate, but I don't mean that kind of cap; I mean something like the real framerate target used on consoles.
 
Lolol.gif
 
I don't know what he is trying to say, but I will say that PC needs more exclusives and less shitty console ports.
 
The human eye can clearly tell the difference between 400FPS and 430FPS.

If you can't see the difference between 30FPS and 60FPS you should go to a doctor.

The US Air Force found in tests that the human eye can see each individual frame at 200FPS and extract information from each one.
 

You have GOT TO BE kidding me! How can anyone NOT notice that GTA5 runs at 15fps on the Xbox 360? I bought that game not wanting to wait for the PC release, played it for less than an hour, and never picked it up again... it runs so shitty it's not even enjoyable. There is a massive difference between gaming at 30fps and 60fps. You just don't know the difference because you've always gamed at 30fps or less.

edit: set your vsync to sync on 2 frames instead of one (some games have this setting). Enjoy your crappy 30fps
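If you'd rather do it in code than in the driver panel, here's a minimal sketch of that same trick using the pyGLFW bindings (my own example, not anything from this thread; it assumes a 60Hz display and a driver that actually honors swap intervals above 1):

```python
# Half-rate vsync sketch: swap every 2nd vblank, so a 60 Hz display caps at 30 FPS.
# Assumes pip-installed 'glfw' bindings and a driver that honors intervals > 1.
import glfw

glfw.init()
window = glfw.create_window(1280, 720, "30fps cap demo", None, None)
glfw.make_context_current(window)
glfw.swap_interval(2)  # 1 = normal vsync (60 FPS), 2 = every other vblank (30 FPS)

while not glfw.window_should_close(window):
    # ... draw the frame here ...
    glfw.swap_buffers(window)  # blocks until the second vblank
    glfw.poll_events()

glfw.terminate()
```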
 
You can always set your monitor to 30 Hz and turn Vsync on...
That is the primary reason I stopped playing GTAV on PS3. The framerate constantly chugs along happily at 20 FPS or less. Forget about 30 FPS; I want to know when <30 FPS all of a sudden became acceptable...
 
Something seems amiss here. Maybe you have a problem somewhere in your system that's causing issues with gaming. Mind you, I can't tell a huge difference between 30 and 60fps; honestly, those just feel slow once you get used to 120/144Hz.
 
I can very easily tell the difference between 30fps, on either console or PC, and 60fps. There are many console games that target 60fps because of how much better the gameplay is at that speed (FPS and racing games, usually).
 
The reason you think 30FPS runs flawlessly on a console is the added motion blur they use to cover up the low FPS. Once that feature is turned off, you won't like 30FPS very much.
 

The interesting thing about this is that the human eye does not see in frames per second. That's something the "the human eye cannot see anything over 60 frames, herpa derp" crowd never seems to understand. You know who I'm talking about. Any time there's a frame rate discussion on a mainstream FPS game forum, like BF4's, these people chime in with their matter-of-fact demeanor.
 

Not to mention that a console running at 30fps most of the time, with a few dips into the 20s, isn't as "jarring" as a PC running with 60Hz vsync enabled and frame drops. For the most part, the console running 30fps is more consistent (this of course all depends on the PC running the game). I still like to have my >30fps, though.
 

3.6 years on [H] and you post this...
Making [H] look bad, bro.
 

12-16 FPS is enough to make humans perceive motion; early films and early animation used frame rates as slow as this.

24-30 FPS in film became the standard by 1930. With how exposure works to blur and blend frames together, film can get away with a lower frame rate. There's no great solution for video games to match the motion blur quality of film, and you can feel the delayed response when playing video games at 24 FPS.

48-60 FPS is pretty much a "new" format for film that's being tested. The purpose of it is to reduce motion blur and flicker in order to produce a sharper and smoother-looking movie. There's an obvious and immediate difference between watching a movie at 24 FPS and at 48 FPS, and people often describe it as feeling more life-like and smooth. For video games, 30 FPS -> 60 FPS is a similar comparison. For those who care about games it's an obvious and immediate difference. Also, the power grid works at 50-60Hz depending on where you live. Lights actually pulse at 50-60 FPS because of this, but you'd never notice it, because your eyes don't pick up on something that fast. But if you record a video whose frame rate is out of sync with your power grid, you'll often see wavy, moving lines from the light.

120 FPS is the "high end" of current media, games, TVs, and monitors. Of course there's stuff that goes higher, and there are some TVs that support 240Hz for 120Hz in 3D, but you get diminishing returns going from 60 to 120, and even more so going beyond 120.

240 FPS starts to reach the maximum the human eye can see and react to. Your eyes/brain react differently to brightness, darkness, color, and smoothness of motion, and what frame rate you can "see" relies heavily on those factors. "Tests with Air Force pilots have shown that they could identify a plane in a picture that was flashed for only 1/220th of a second." But you have to remember that the lights in our homes pulse at 60Hz, and displays often switch to black between frames at 60Hz, yet we don't notice those things. And apparently with VR, a 95Hz refresh-to-black is enough for very good persistence.
 

Can you post a scientific document where it's written that a human can tell the difference between 400 and 430FPS?
That is bosh ;)


It is clear that many of you who are replying in such a "strong way" don't even know how computer animation is done.
If an animation is calculated to be played at 800 frames per second, it will run poorly if played at 400 frames per second.
This is what most people answering this post don't know, and it is clear from these kinds of replies, so please don't try to teach something you don't know.

If an animation is calculated to be played at 30 frames per second,
it will look great to every normal person
if played at 30 frames per second.

Film runs at 24FPS; do you see the choppiness?
Please, don't talk about blur, you don't even know what it is.


This is proof that most of you don't know what you are talking about.
PC games are made to be run at 60FPS.
Only a few games are able to change their rendering target on the fly. Very few.

This means that if you play a game that has a rendering target set to 60FPS on a 30Hz monitor, you will see choppy animation, and if you play the same game at 120FPS you will see no improvement, since the game only has 60 frames of animation; the "empty" frame is simply a copy.

Many of the kiddy experts here who claim to see big improvements while playing at 120FPS don't even know that they are seeing only 60FPS, and the other 60FPS is simply a copy of the first.


I can easily tell the difference between 30FPS on console and 30FPS on PC;
I cannot easily see the difference between 30FPS on console and 60FPS on PC.

I'm really, really sure that very few people are able to, but all the kiddies say the opposite :)
 
It's probably hard to do.

I remember running Crysis 1 on a slow Core 2 Duo, and while the FPS was okay, I thought the physics animations were happening in slow motion. Once I got a quad core the FPS was higher, sure, but the trees I was chopping down were literally falling faster.

So I get what you're saying, it's wonky, but I suppose it's the nature of PC gaming where different PCs will have wildly different computational capabilities.

In the end, tweet a developer and ask, because I don't think they hang out on [H]
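For what it's worth, my guess at the slow-motion trees: some engines of that era advanced physics by a fixed step per rendered frame instead of by elapsed time, so fewer frames per second literally meant slower motion. A toy illustration with made-up numbers (not CryEngine's actual code):

```python
# Toy illustration of frame-locked physics (made-up numbers, not CryEngine code).
# The tree advances a fixed angle every rendered frame, so its real-world
# speed scales with the frame rate.
FALL_STEP = 0.02  # radians the tree falls per rendered frame

for fps in (20, 60):
    print(f"{fps} FPS -> tree falls at {FALL_STEP * fps:.2f} rad/s")
# 20 FPS -> 0.40 rad/s (slow motion); 60 FPS -> 1.20 rad/s
```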
 
Oh this is an oldie but a goodie. Kinda like 1911 vs. Glock or AK vs. AR.
 

You couldn't have made it more clear. This is proof that YOU don't know what you are talking about.

PC games are NOT made to run at 60FPS. Only a handful have a FIXED 60FPS, like Rage for example. Most games that have a fixed 60FPS are CONSOLE PORTS. Go figure.

A PC game will render as fast as the video card can. You can FORCE a 60FPS maximum with Vsync (or 120FPS with 120Hz monitors).
 

I think he's confusing this with the way some game engines handle physics and animation. Take Skyrim, for example. When Skyrim is played beyond 60fps, say at 120fps, it causes all sorts of wonky shit because the game's physics and animation are governed by the frame rate. Coincidentally, Skyrim is a console port, lol.

Just check his signature: he's using a 60Hz monitor. As always, until someone has used a good CRT or a 120/144Hz monitor, it's best to reserve bold claims.
 
I'm thinking that the OP is running his consoles on a 120Hz (or 240Hz) TV with TruMotion enabled.
 
It is clear that many of you who are replying in such a "strong way" don't even know how computer animation is done.
If an animation is calculated to be played at 800 frames per second, it will run poorly if played at 400 frames per second.
This is what most people answering this post don't know, and it is clear from these kinds of replies, so please don't try to teach something you don't know.

If an animation is calculated to be played at 30 frames per second,
it will look great to every normal person if played at 30 frames per second.
I won't get into personal attacks here. In video games, an animation has key frames and is played out over a fixed amount of time; animations are not set to run at a fixed frame rate. The rest of the animation is interpolated based on how fast the game's update loop is running, so it appears smooth and natural regardless of the frame rate. Interpolation functions calculate how much the model transforms in between key frames. But there are a lot of games out there, namely console ports, that lock the engine to the tick rate, which causes sync issues that make the animation appear abnormal when the frame rate is faster than the tick rate. L.A. Noire is a perfect example of this. While the developers claim the voices would be out of sync with the facial animations outside of 30 FPS, that is just an excuse for their old programming methodology and how they decided to capture the actors' facial performances.
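To make that concrete, here's a bare-bones sketch of time-based keyframe sampling (hypothetical keyframe data, no particular engine's API). The pose is a function of elapsed seconds, so the swing takes the same wall-clock time at any frame rate:

```python
# Time-based keyframe interpolation sketch (hypothetical data, no real engine API).
keyframes = [(0.0, 0.0), (0.5, 90.0), (1.0, 180.0)]  # (time in s, joint angle in deg)

def sample(t):
    """Linearly interpolate the joint angle at elapsed time t."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, a0), (t1, a1) in zip(keyframes, keyframes[1:]):
        if t <= t1:
            u = (t - t0) / (t1 - t0)      # 0..1 progress between the two keys
            return a0 + u * (a1 - a0)
    return keyframes[-1][1]

# 30 FPS and 60 FPS both finish the same 1-second swing; 60 FPS just samples
# the curve twice as often:
for fps in (30, 60):
    angles = [sample(i / fps) for i in range(fps + 1)]
    print(fps, "FPS:", angles[0], "->", angles[-1])
```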
Film runs at 24FPS; do you see the choppiness?
Please, don't talk about blur, you don't even know what it is.
You cannot compare real videography to video game graphics, so I don't know what he was on about here.
This is proof that most of you don't know what you are talking about.
PC games are made to be run at 60FPS.
Only a few games are able to change their rendering target on the fly. Very few.

This means that if you play a game that has a rendering target set to 60FPS on a 30Hz monitor, you will see choppy animation, and if you play the same game at 120FPS you will see no improvement, since the game only has 60 frames of animation; the "empty" frame is simply a copy.
PC games are generally made to run at whatever frame rate the hardware will allow. When programmed properly, the game's update loop is based on the amount of time that has passed, so the game runs at a frame rate determined by how long the hardware takes to process one update loop, and the engine's timing keeps everything in sync.

Many of the kiddy experts here who claim to see big improvements while playing at 120FPS don't even know that they are seeing only 60FPS, and the other 60FPS is simply a copy of the first.
Again, this might be true if the game's update loop were tied to the tick rate. That was the way of doing things when you were programming a game for one hardware configuration. The PC has millions of possible hardware configurations, and developers can't get away with locking a game to the tick rate anymore; look at all the complaints whenever a new Need for Speed is released. A game's update loop cannot run continuously, due to the nature of CPUs, so the update loop calculates what happened in between frames based on how much time has passed. The game isn't presenting you frames at a fixed rate.
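A quick sketch of the kind of loop being described (simplified; real engines usually clamp dt and step physics on a fixed interval, but the principle is the same):

```python
# Simplified delta-time update loop: world speed depends on elapsed time,
# not on how many frames happen to get rendered.
import time

SPEED = 5.0                  # units per second, regardless of frame rate
position = 0.0
prev = time.perf_counter()

while position < 100.0:
    now = time.perf_counter()
    dt = now - prev          # seconds since the last update
    prev = now
    position += SPEED * dt   # same real-world speed at 30, 60, or 144 FPS
    # render(position) would go here; render cost only changes dt, not speed
```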
I can easily tell the difference between 30FPS on console and 30FPS on PC;
I cannot easily see the difference between 30FPS on console and 60FPS on PC.

I'm really, really sure that very few people are able to, but all the kiddies say the opposite :)
It is quite obvious what the difference is. I recently saw my dad playing Test Drive Unlimited 2 on the Xbox 360, and I was appalled by both the frame rate and how badly the controls responded at 30 FPS compared to the PC version at 60+ FPS.

Not to mention that a console running at 30fps most of the time, with a few dips into the 20s, isn't as "jarring" as a PC running with 60Hz vsync enabled and frame drops. For the most part, the console running 30fps is more consistent (this of course all depends on the PC running the game). I still like to have my >30fps, though.
It sounds like you have bought into the console marketing speak they use when trying to justify 30 FPS. I play BF4 on a 144Hz monitor and my frame rates often range from 50-110 FPS in multiplayer. There is nothing jarring about it. Are you people who say this playing with frame interpolation (TruMotion, et al.) turned on on your TVs?
 
You couldn't have made it more clear. This is proof that YOU don't know what you are talking about.
Only a handful have a FIXED 60FPS, like Rage for example. Most games that have a fixed 60FPS are CONSOLE PORTS.

I figure that most PC games are CONSOLE PORTS ;)
so please, shut up if you don't know :)

I think he's confusing this with the way some game engines handle physics and animation. Take Skyrim, for example. When Skyrim is played beyond 60fps, say at 120fps, it causes all sorts of wonky shit because the game's physics and animation are governed by the frame rate. Coincidentally, Skyrim is a console port, lol.

Just check his signature: he's using a 60Hz monitor. As always, until someone has used a good CRT or a 120/144Hz monitor, it's best to reserve bold claims.

I'm not confused, I'm enlightening the kids on how PC animation works.

Again, this might be true if the game's update loop were tied to the tick rate. That was the way of doing things when you were programming a game for one hardware configuration. The PC has millions of possible hardware configurations, and developers can't get away with locking a game to the tick rate anymore.

This is true for most games.
A game that runs really badly at 40FPS is the proof. I'm not talking about stutter; I'm talking about a fixed 30FPS, or a fixed 40FPS.
A game that runs choppy at a fixed 30FPS is proof of what I'm saying.
Fix the framerate of a game at 30FPS and tell me which games don't run choppy.

This MEANS THAT the game's animations "are calculated for 60FPS".
I'm not saying that 30FPS is comparable to 60FPS; I mean that if a game engine is able to downscale the animation to 30FPS, then 30FPS is enough to play decently.
Since this is not the case, playing PC games at 30FPS is a scandal, not because of the framerate but because of the engine, which doesn't do what it does on console.
 
30 frames per second on console looks like 30 frames per second on PC to me. I find 30FPS to be a bit low. Keep in mind my experience is from the PS2 and N64.
 
Film runs at 24FPS; do you see the choppiness?
Please, don't talk about blur, you don't even know what it is.
When a computer renders 24 frames every second, each frame only contains data that was relevant for a fraction of a millisecond. This effectively means you're viewing a series of still images with large information gaps in between.

When you record to film at 24 FPS, each frame contains up to 41 ms worth of data mushed together. For a computer to even come close to emulating this, it would need to render at somewhere around 1000 FPS (and blend every group of ~41 frames into a single frame).

To answer your question: yes, of course I see choppiness in 24 FPS film content... but it's mostly bearable because of the way film works. As long as the camera's shutter speed isn't set too fast, the choppiness doesn't manifest grossly all that often.
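The arithmetic behind those numbers, for anyone curious: one film frame at 24 FPS spans 1/24 s ≈ 41.7 ms of light. A crude way for a renderer to fake that integration is temporal supersampling: render lots of sub-frames and average each group down to one output frame. A rough sketch with made-up frame data (tiny resolution just to keep it light):

```python
# Crude temporal supersampling sketch (made-up frame data, tiny resolution).
# Render ~1000 sub-frames per second, then average every group of ~41 into one
# 24 FPS output frame to approximate a film camera's exposure.
import numpy as np

SUB_FPS, OUT_FPS = 1000, 24
group = SUB_FPS // OUT_FPS                      # ~41 sub-frames per output frame

sub_frames = np.random.rand(SUB_FPS, 90, 160)   # one second of fake sub-frames
usable = (SUB_FPS // group) * group             # drop the remainder
blurred = sub_frames[:usable].reshape(-1, group, 90, 160).mean(axis=1)
print(blurred.shape)                            # (24, 90, 160): blurred frames
```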
 
Time to glean what information I can, since everybody always talks about this.

Does frame pacing vary from console to PC, or from movies to PC?
Am I right to assume, based on Unknown-One's post, that frame pacing is just as important as FPS?
 
Um, that is odd, because Crysis 1 doesn't even effectively use anything beyond 2 cores. Even with an OCed i5 it will drop into the 40s in spots.
 
On your console, your controller has so much lag built in that you can't even feel it, and your ability to turn with the controller is so slow you don't notice it. Run a game at 30 fps and move your mouse very slowly; it will look all right. Now whip your mouse in a split-second 180 turn and see what happens. Looks horrible. Now try to do that on a console; oh, you can't, because your turn rate is limited by the max turn rate set by the game and the joystick.

The second I sit down at a console I notice the lag and low FPS right away. Most console gamers have just adapted to it and to playing with it, so they notice it less.
 
This is one of the more strange threads I've ever seen on [H].

I can't imagine why anyone would prefer 30FPS over 60FPS, but different strokes for different folks I guess...
 

Pretty much this. There are exceptions (most Nintendo games, BF4 on the PS4, most fighting games), but generally speaking I can't stand the jerkiness on consoles. Dark Souls was a fun game, but it pissed me off like none other on the 360. On the PC? Way different story.

Of course, frame rate isn't everything. I thought Crysis was actually pretty fluid at 40fps, while Skyrim at 60fps is still jerky (and not just the animations).
 
Amazing troll is amazing. Though I'm not sure why someone with just under 4 years on their account would risk being banned over starting such a ridiculous bait thread.
 

No, I haven't bought into any console marketing speak. My example was a pretty specific one. In full disclosure, I play on CRTs and usually set their refresh rate to 75Hz (it looks sharper to my eye than 85Hz, and the flickering doesn't bother me even at 21"+ monitor sizes), and I leave VSync off and let the framerate fall where it may. I usually shoot for minimums above 40 fps.

As for your example, I agree that 50-110 FPS isn't jarring, but I would say that dropping from 60fps to 30 and back up again in an instant (VSync) is a more jarring experience than a constant 30. Honestly, I think we're all probably being trolled here, but I wanted to explain myself a bit. ;)

EDIT: And I would also say that a 50 fps minimum is excellent. So even if you fall from 110 fps down to 50, you're still way above 30 fps, and your game will feel peppier than someone's who falls from 60 to 30. In addition, console controls lend themselves pretty well to 30 fps gaming.
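And for anyone wondering why double-buffered VSync snaps from 60 straight down to 30 rather than, say, 55: a frame that misses the 16.7 ms vblank deadline has to wait for the next vblank. Quick illustrative arithmetic:

```python
# Why double-buffered vsync on a 60 Hz display snaps 60 -> 30 -> 20:
# a frame that misses one vblank deadline waits for the next one.
import math

refresh = 1 / 60                          # vblank every ~16.7 ms
for render_ms in (15, 18, 35):
    vblanks = math.ceil((render_ms / 1000) / refresh)
    print(f"{render_ms} ms render -> {1 / (vblanks * refresh):.0f} FPS")
# 15 ms -> 60 FPS, 18 ms -> 30 FPS (not 55!), 35 ms -> 20 FPS
```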
 