Where are the 240 Hz video cards?

pinoy

We've all seen them. Pretty much the majority of new TVs sold today advertise 120, 240, even 600 Hz frame interpolation capabilities. It's a pretty neat feature, I think. Fortunately there's Smooth Video Project for computer users, but when will video cards start implementing that in hardware?
 
Before ANYONE flames you, I'll let you down gently. The TVs that 'support' 240, 400, or 1000Hz don't actually support that refresh rate as an input; they usually only accept a 50-60Hz input and then use image processing to blend frames together. The fastest monitors and TVs available commercially only accept a 144Hz input.
 
OP knows that..
 
Those are not really that fast. Most of them are still plain-Jane LCDs that refresh at a solid 60Hz... a cheap 4K TV may only do 30Hz... they feed you some marketing BS that leads you to believe they are doing 120, 240, etc. refresh.
 
What I think he is trying to ask is...

When will video cards be able to generate the in-between frames like TVs do... which would be like double / triple buffering, but also creating in-between frames and showing them all to the user much faster.
 
I'm sure from the way OP wrote it that he's aware it's not a true 240Hz refresh rate. He's interested in the interpolation / frame-insertion techniques that emulate smoother motion, which IMO would be much more beneficial for computer usage in general.

Doesn't Eizo's VA 120Hz monitor do pretty much the same thing with Turbo 240, as does that LG TN 24GM77 with TruMotion 240?

I would be pretty interested in trying to go beyond that to see what "600Hz" or whatever the TVs are at now would look like for computer gaming.
 
Well, the problem is that to add 'moar hertz' to a feed, you need to know the frame AFTER the frame you wish to interpolate.

Take for instance, say you want to 'upscale' a 144Hz signal to a 288Hz signal. Your video card renders one signal frame (let's call it Frame A) and you want to interpolate it with the next signal frame (Frame B) to create an interpolated frame (Frame AB). The problem is that you can't 'predict' what the next signal frame is going to be. Your video card has yet to render Frame B, so it can't interpolate between A and B to create AB. You can add a bit of latency to the whole process, let's say 1 frame of latency, but then you are tripling the framebuffer requirement. The video card now needs to store Frame A, Frame B and the placeholder for Frame AB, all while working on Frame AB. And it's doing this while trying to set up the render and/or rendering Frame C, so your video card is working harder/slower and the latency is increased.

Kind of doesn't make sense for real-time stuff.
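To make that concrete, here's a rough sketch of the buffering problem (everything here is a made-up placeholder; a real interpolator would do motion estimation, not string mangling):

Code:
from collections import deque

def interpolate(frame_a, frame_b):
    # Stand-in for real motion estimation; just marks the blended frame.
    return f"interp({frame_a},{frame_b})"

def display(frame):
    print("showing:", frame)

buffer = deque(maxlen=2)  # must hold Frame A and Frame B at the same time

for rendered in ["A", "B", "C", "D"]:  # frames as the GPU finishes them
    buffer.append(rendered)
    if len(buffer) == 2:
        a, b = buffer
        # A (and the in-between frame AB) can only be shown once B exists,
        # so everything the user sees lags at least one rendered frame.
        display(a)
        display(interpolate(a, b))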
 
With a 60Hz monitor, you get a new image every 16ms, with a 120Hz monitor you get a new image every 8ms, and with a 144Hz monitor you get a new image every 7ms.

240Hz would lower that to 4ms, but I'm not hearing complaints from 144Hz gamers that they want faster. Would humans be able to detect that?
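For what it's worth, those frame times are just 1000 ms divided by the refresh rate:

Code:
for hz in (60, 120, 144, 240):
    print(f"{hz}Hz -> {1000 / hz:.1f}ms per frame")
# 60Hz -> 16.7ms, 120Hz -> 8.3ms, 144Hz -> 6.9ms, 240Hz -> 4.2ms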

Edit: I was doing so well until...
60Hz 4K monitors should (in theory anyway) be able to run 960x540 at 240Hz, so I'd be interested in hearing if that's been tried for twitch games like Counterstrike.
Edit: don't post after midnight; this kind of bonehead post might happen to you.

OP, the interpolation that 120Hz, 240Hz or 600Hz TVs do is worse than useless for gaming. Interpolating a video can be done because the next frame after the current one is always known, but in gaming the gamer changes what will happen in the next frame, so inserting a smoothing image between the current image and the next image is impossible.
 
Ahh yeah, foolish me, of course there's a big difference between just interpolating a stream/video playback and real-time rendering. It's obvious that won't work as well. Dangerous to visit forums this early in the morning when the brain is still asleep. :)
 
OP, the interpolation that 120Hz, 240Hz or 600Hz TVs do is worse than useless for gaming. Interpolating a video can be done because the next frame after the current one is always known, but in gaming the gamer changes what will happen in the next frame, so inserting a smoothing image between the current image and the next image is impossible.

Ever heard of double/triple buffering?

Technically speaking it should be possible to do frame interpolation in games. But besides the soap opera effect, there would be ghosting, warping, motion blur and other artifacts that, while tolerable in video, could be much more evident in gaming.
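As a toy illustration of where the ghosting comes from (naive frame blending, not the motion-vector tricks real TVs use): averaging two frames of a moving object leaves a faint copy of it in both positions:

Code:
frame_a = [0, 9, 0, 0]  # a bright pixel at position 1
frame_b = [0, 0, 0, 9]  # the same object, now at position 3

blended = [(a + b) / 2 for a, b in zip(frame_a, frame_b)]
print(blended)  # [0.0, 4.5, 0.0, 4.5] -- two half-brightness ghosts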
 
60Hz 4K monitors should (in theory anyway) be able to run 960x540 at 240Hz so I'd be interested in hearing if that's been tried for twitch games like Counterstrike.

What about lowering the resolution would suddenly allow a monitor to quadruple its refresh rate? A 60Hz panel is 60Hz regardless of resolution. As others have mentioned, frame interpolation has nothing to do with actual refresh rates, especially in the context of real-time interactive gameplay.
 
Technically speaking it should be possible to do frame interpolation in games. But besides the soap opera effect, there would be ghosting, warping, motion blur and other artifacts that, while tolerable in video, could be much more evident in gaming.

You can easily try that on a TV. You get no better experience, just a laggier one. I don't see any reason for frame interpolation outside of maybe sports, where it kinda works to smooth the movement.
 
What about lowering the resolution would suddenly allow a monitor to quadruple its refresh rate? A 60Hz panel is 60Hz regardless of resolution. As others have mentioned, frame interpolation has nothing to do with actual refresh rates, especially in the context of real-time interactive gameplay.

Perhaps he means in terms of the datastream between the devices? For example, if enough information can be sent to the monitor to maintain a smooth, constant 60 FPS at 4K (2160p), then it should be possible to send a smaller amount of information (540p) faster (4x). This discussion involves more than just the throughput of the connection between the devices, however.
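On raw pixel throughput alone, the numbers do work out in 540p's favor (assuming uncompressed pixels and ignoring blanking intervals):

Code:
pixels_4k_60 = 3840 * 2160 * 60     # ~497.7 million pixels/second
pixels_540_240 = 960 * 540 * 240    # ~124.4 million pixels/second
print(pixels_4k_60 / pixels_540_240)  # 4.0
# 540p at 240Hz needs only a quarter of the pixel rate of 4K at 60Hz,
# so the link isn't the bottleneck -- the panel's fixed refresh rate is.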

I know a few semi-competitive gamers that run 120Hz displays and intentionally play at lower resolutions (1600x900 instead of 1920x1080 for example) because it keeps their FPS above 120, but there are drawbacks to scaling back resolution too far, especially with the insane draw distances that some game engines are capable of. One of them complained about losing too much detail for distant objects/players if scaling resolution down too far. In the end, I suppose it is up to each player to fiddle with the settings and determine what works best for their play-style and/or level of competitiveness.
 
What about lowering the resolution would suddenly allow a monitor to quadruple its refresh rate?

I must have been really tired to, uh, let my little brother post that. Yeah, that's what happened. No, really.

:(
 