Is this true? DVI question

I read this on another forum and I'm curious, since I'd imagine a lot of us use standard HDMI or single-link DVI. So are we only getting 60 fps in reality (physically, what our monitor is actually showing us), or is this wrong? If it's true, it makes me think I should turn vsync on in everything I play.

DVI single link supports 1600*1200 at 60 fps at max
DVI dual link supports 2048*1536 at 85 fps at max

thanks for clearing this up
 
Disregarding frame rates for the moment, the resolution values above are plain wrong, which leads me to believe the rest is wrong too.
 
In what sense are they wrong?

1600x1200 @ 60 Hz was originally considered the roof of single-link DVI, but reduced blanking has improved on that: the 165 MHz pixel clock is enough for 1920x1200 @ 60 Hz with reduced blanking.
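To make that concrete, here is a back-of-the-envelope sketch in Python of the pixel-clock math. The blanking overhead factors below are rough assumptions, not exact CVT/CVT-RB timings; only the 165 MHz single-link limit comes from the DVI spec.

```python
# Rough check of what fits inside single-link DVI's 165 MHz pixel clock.
# The blanking overhead factors are approximations, not exact CVT timings.

DVI_SINGLE_LINK_MHZ = 165  # max pixel clock for single-link DVI

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead):
    """Required pixel clock: visible pixels x refresh rate x blanking factor."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for label, overhead in [("standard blanking, ~25% extra", 1.25),
                        ("reduced blanking, ~12% extra", 1.12)]:
    for w, h in [(1600, 1200), (1920, 1200)]:
        clk = pixel_clock_mhz(w, h, 60, overhead)
        verdict = "fits" if clk <= DVI_SINGLE_LINK_MHZ else "exceeds the limit"
        print(f"{w}x{h} @ 60 Hz ({label}): {clk:.0f} MHz -> {verdict}")
```

With these rough numbers, 1920x1200 @ 60 Hz only fits with reduced blanking, which matches why that mode works on single-link DVI in practice.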

I'm not sure I follow the question. The frequency at which the monitor displays the content is the same as the maximum number of frames per second it can show (the number of times the image is refreshed every second). This is usually 60 frames per second, but it varies, and at lower resolutions higher frequencies are possible (in theory).

Ultimately it depends on the monitor (and of course on whether the bandwidth of the interconnect can handle higher frequencies), and most monitors only support 60 Hz. There are of course the new 120 Hz monitors, and a few monitors can be run at 75 Hz or 85 Hz.

The problem with vsync is that it can hurt performance if your graphics card is not up to the task of always rendering at 60 fps (or whatever refresh rate is used) or higher; with plain double buffering, missing that target drops the framerate to an even fraction of it (from 60 straight down to 30).
 
I guess I'm confused. Say we play a game like Left 4 Dead and it shows us 200 fps or so in game, but our monitor only displays 60 fps or so: what do we really see?
 

Hz = the max fps the monitor can display,
so on a 60 Hz monitor you get 60 fps displayed, max.
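A tiny illustration of that, glossing over tearing and assuming vsync is off:

```python
# The monitor refreshes a fixed number of times per second no matter how
# fast the game renders, so complete displayed frames are capped at the
# refresh rate. (Without vsync, slices of several rendered frames can
# land inside a single refresh -- that's tearing.)

def displayed_fps(rendered_fps, refresh_hz=60):
    return min(rendered_fps, refresh_hz)

print(displayed_fps(200))  # 60 -> most of those 200 frames are never fully shown
print(displayed_fps(45))   # 45 -> the monitor repeats frames to fill the gaps
```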
 
So if you're able to get 100+ fps in a game, would it be best to turn on vsync?

The nasty screen-tearing effect shows up whenever a new frame arrives mid-refresh, and it's most noticeable when the framerate exceeds the monitor's refresh rate, so I always try to leave vsync enabled, especially if I'm getting over 100 fps.
 
Or just turn up game settings like AA and AF (and triple buffering if you can) to bring the fps down. Your monitor can only display a certain number of frames per second anyway.
 
Correct me if I am wrong, but let's say you have your refresh rate locked at 60 Hz: with vsync enabled, your in-game fps will be locked at 60, but the image will be nice and smooth with no tearing. With vsync disabled, the sky is the limit regarding the fps you can achieve, but tearing will occur.
 

Yes, but it causes noticeable input lag at 60 Hz,
and of course heavy framedrops / low fps if the system isn't fast enough.
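A toy simulation of why that happens with plain double-buffered vsync on a 60 Hz monitor (the numbers are illustrative, and triple buffering behaves differently):

```python
import math

REFRESH_HZ = 60
VBLANK_S = 1.0 / REFRESH_HZ  # ~16.7 ms between refreshes

def displayed_fps_with_vsync(frame_time_s):
    # A finished frame can only be flipped on a refresh boundary, so the
    # effective frame time rounds UP to a whole number of refresh intervals.
    intervals = math.ceil(frame_time_s / VBLANK_S)
    return REFRESH_HZ / intervals

print(displayed_fps_with_vsync(1 / 200))  # 60.0 -> capped at the refresh rate
print(displayed_fps_with_vsync(1 / 60))   # 60.0 -> just makes every vblank
print(displayed_fps_with_vsync(1 / 50))   # 30.0 -> misses a vblank, fps halves
```

That last case is the heavy framedrop: missing 60 fps by a little drops you all the way to 30, and waiting for the next refresh is also where the extra input lag comes from.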
 

This is interesting, since I've noticed vsync is great in some games and just terrible in others.

AA and AF I rarely touch; I dunno, they seem so finicky. Too bad we can't have our settings automatically raise or lower AA or AF depending on the fps we want to see.
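For what it's worth, that kind of auto-adjustment is implementable as a simple feedback loop. A toy sketch, where all names and thresholds are made up rather than any engine's real API:

```python
# Nudge the AA level up or down once per measurement to hold a target
# framerate. AA_LEVELS and the thresholds here are hypothetical.

AA_LEVELS = [0, 2, 4, 8]  # MSAA sample counts, cheapest to most expensive

def adjust_aa(index, measured_fps, target_fps=60, margin=5):
    if measured_fps < target_fps - margin and index > 0:
        return index - 1                      # too slow: lower AA
    if measured_fps > target_fps + margin and index < len(AA_LEVELS) - 1:
        return index + 1                      # headroom: raise AA
    return index                              # close enough: leave it alone

idx = 2  # start at 4x MSAA
for fps in [48, 52, 70, 72, 58]:              # pretend per-second measurements
    idx = adjust_aa(idx, fps)
    print(f"measured {fps} fps -> use {AA_LEVELS[idx]}x MSAA")
```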
 