Gamers! I pushed my LCD monitor to an 84Hz refresh rate!

geminox

Limp Gawd
Joined
Jan 25, 2005
Messages
173
HOLY CRAP!

This is the monitor that I have : LG Flatron L227WTG

http://www.bestbuy.com/site/olspage...&ref=06&loc=01&ci_src=14110944&ci_sku=8739118

I bought this monitor after reading the review of it in the Anandtech LCD thread. There was supposedly no input/output lag, and since I'm a very avid FPS gamer, this was the choice for me.

The thing that really made me frown, though, was the issue of refresh rates. I was very happy with my old CRT @ 144Hz until it finally gave its last breath. I got this LCD (finally no more back pain hauling to/from LANs) and thought I would be able to push at least 75Hz at native (1680 x 1050); unfortunately, that was not the case.

So I did a lot of googling and found this little mod you can do to your DVI -> VGA adapter. Apparently you can bypass the EDID (the data stored in your monitor that lists its supported resolutions and timings) and set your monitor to any mode you want! I warn you that this is probably not the safest thing to do to your LCD panel (it could cause long-term damage), but I was able to push my monitor to 84Hz at 1440 x 900 (yes, I know, a little short of native, but good enough for me and much more rewarding in terms of gaming).

I figured this might be helpful information for hardcore FPS gamers who are worried about the transition from CRT to LCD. This monitor continues to impress me: no input/output lag, and a refresh rate I never could have imagined.

Hopefully someone can help me verify my claim and confirm that the LCD is not skipping frames (it doesn't feel like it); it definitely feels/looks much smoother than 75Hz.

DVI -> VGA EDID Bypass Mod:

http://www.overclock.net/faqs/47971-how-set-whatever-res-i-want.html
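For anyone curious what the mod is actually defeating: EDID is just a 128-byte block the monitor exposes over the DDC pins, with its supported timings inside and a checksum byte at the end. Here's a minimal Python sketch of that checksum rule (illustrative only, not tied to any real capture tool):

```python
# Base EDID blocks are exactly 128 bytes and must sum to 0 mod 256;
# byte 127 is the checksum that makes the total come out to zero.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_checksum_ok(block: bytes) -> bool:
    return len(block) == 128 and sum(block) % 256 == 0

def fix_checksum(block: bytes) -> bytes:
    # Recompute byte 127 from the first 127 bytes.
    body = block[:127]
    return body + bytes([(-sum(body)) % 256])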

Discuss....? :)
 
There is little point in using higher refresh rates on most LCDs as the panels themselves are fixed at 60Hz no matter what the input frequency.
So despite using a higher refresh, it will only display 60fps max.
There is no advantage to using a higher refresh rate on those panels.
If you want to do longer jumps etc as in some games at high fps, this technique is of no help.
This technique needs vsync to be disabled so the game itself is not locked to any refresh rate.

There is a downside to using a vsynced refresh rate higher than the panel can display: the image can appear to stutter or jerk a tiny bit.
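To put numbers on that: if the panel really shows only 60 of the incoming frames per second, the drop rate from an 84Hz input is easy to sketch (back-of-envelope arithmetic, not a measurement):

```python
def dropped_per_second(input_hz: int, panel_hz: int = 60) -> int:
    # A panel refreshing panel_hz times/s can show at most panel_hz
    # of the input frames; anything above that is silently dropped.
    return max(0, input_hz - panel_hz)

def ms_between_drops(input_hz: int, panel_hz: int = 60) -> float:
    drops = dropped_per_second(input_hz, panel_hz)
    return 1000.0 / drops if drops else float("inf")
```

At 84Hz in, that's 24 dropped frames per second, one roughly every 42ms, which is frequent enough to read as a visible hitch rather than smoothness.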

If it works for you, that's great :)
 
I accidentally pushed my laptop to 120Hz using PowerStrip once. It gradually got brighter and brighter until I was almost blinded, similar to looking at a 100-watt bulb. Fortunately I managed to turn it off before it exploded. Be careful when "overclocking" TFTs :)
 
What exactly should I be looking out for in case the LCD starts to bug out...? It seems fine so far... There's always great risk in going beyond manufacturers' settings, and I'm sure a lot of us enjoy squeezing out every ounce of performance. It was said in an earlier reply that a higher refresh rate would not help on this particular panel. What panels are capable of, or suited to, higher refresh rates...?

I would like someone to suggest a way to verify my claims / help me run some tests to see if my LCD is skipping frames.
 
I attempted to express that most LCDs cannot benefit from increased refresh rates.
If you get a perceivable difference, then you have a nice screen.

As for when it will fail, there isn't much information on the failure modes caused by high refresh rates.
If they are hard-limited normally, it's not wise to push beyond what they do at default.
Failure may be instantaneous and irreversible.
 
There is little point in using higher refresh rates on most LCDs as the panels themselves are fixed at 60Hz no matter what the input frequency.
So despite using a higher refresh, it will only display 60fps max.

There are a number of smaller LCDs that can go over 60Hz without skipping frames. I was able to push an LG W2252TQ to 75Hz at native over DVI (which is quite similar to his monitor), and I'm certain it wasn't skipping frames. It wouldn't surprise me at all if the L227WTG were also capable of it.

If you want to do longer jumps etc as in some games at high fps, this technique is of no help.

This is absolutely not true. Refresh rate has nothing to do with this, as long as vertical sync is disabled. The game is still going to process those frames (whether you can actually see them or not), and if that's going to have an effect on something like jump distance it will do so regardless of your refresh rate.

What exactly should I be looking out for in case the LCD starts to bug out...? It seems fine so far... There's always great risk in going beyond manufacturers' settings, and I'm sure a lot of us enjoy squeezing out every ounce of performance. It was said in an earlier reply that a higher refresh rate would not help on this particular panel. What panels are capable of, or suited to, higher refresh rates...?

I would like someone to suggest a way to verify my claims / help me run some tests to see if my LCD is skipping frames.

I very much doubt that running your LCD at a higher refresh rate will have any significant impact on its lifespan.

At 84Hz, frame skipping should be plainly visible in just about anything if your monitor is actually doing it. I've tried running monitors that are fixed at 60Hz at higher refresh rates before, and the frame skipping was immediately noticeable to me even when moving the mouse. If mouse movement and games appear smoother to you at 84Hz, it is probably not skipping frames.
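One way to check this reasoning: model each panel refresh as showing the newest input frame available at that instant (an idealized model; real scaler buffering may differ). With integer math the skip pattern falls out directly:

```python
def frames_shown(input_hz: int, panel_hz: int, refreshes: int = 6) -> list:
    # Panel refresh k happens at t = k / panel_hz; the newest input
    # frame available by then has index floor(k * input_hz / panel_hz).
    return [(k * input_hz) // panel_hz for k in range(refreshes)]
```

A true 60Hz panel fed 84Hz shows input frames [0, 1, 2, 4, 5, 7] over its first six refreshes: frames 3 and 6 never appear, which is exactly the jumpiness you'd notice while moving the mouse.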

Have you tried just creating a custom resolution (if you have an Nvidia card) or using Powerstrip to force higher refresh rates? It may not actually be necessary to remove those pins from the DVI-VGA adapter or a straight VGA cable.
 
I don't know if it was really faster, but my 19" Planar that I bought around 2003 would accept a 75Hz refresh rate. I drove it at 75Hz over DVI from an ATI 9600XT for about 5 years and it still works great. I am using a laptop right now, so that screen is not in day-to-day use. If you want, I can take it to my brother's comp and try it on his 8800 GT to see if it will still accept 75Hz from a different card and driver.
 
There are a number of smaller LCDs that can go over 60Hz without skipping frames. I was able to push an LG W2252TQ to 75Hz at native over DVI (which is quite similar to his monitor), and I'm certain it wasn't skipping frames. It wouldn't surprise me at all if the L227WTG were also capable of it.
As stated in the post you replied to, some can do this, and if the OP has a display which can, it is his good fortune.

This is absolutely not true. Refresh rate has nothing to do with this, as long as vertical sync is disabled. The game is still going to process those frames (whether you can actually see them or not), and if that's going to have an effect on something like jump distance it will do so regardless of your refresh rate..

Lol, what "is absolutely not true"?
Read what I wrote again and the very next line, I'll quote them both for you :)
If you want to do longer jumps etc as in some games at high fps, this technique is of no help.
This technique needs vsync to be disabled so the game itself is not locked to any refresh rate.
 
IMO you should not mess with a screen's refresh rate. A good friend of mine fried 2 nice screens doing that. Of course, if money is not the issue, go for it. You will be lowering the life of the monitor doing that, plain and simple.
 
I don't see any reason to do this... the panel itself is limited to 60Hz regardless of the input Hz...
The only thing you're doing is shortening the screen's lifespan...
 
What benefit is there in gaming at 120Hz vs 60Hz? Does it feel more responsive?
 
What benefit is there in gaming at 120Hz vs 60Hz? Does it feel more responsive?
The difference is dead and not dead. It feels better and much sharper, of course; it also helps you see better in fast turns. It could be described as "slow motion": that's how 60Hz feels to me after playing at 140Hz or similar. In extreme situations 60fps just is not enough, and whoever sees more wins.

As for tweaking LCDs to run at more than 60Hz: it will just increase input lag, because the panel processes the signal back down to 60 anyway. There is no way to get more than that, so it's not even worth trying. You can input more, but the output will be the same.
 
novemberrain said:
As for tweaking LCDs to run at more than 60Hz: it will just increase input lag, because the panel processes the signal back down to 60 anyway. There is no way to get more than that, so it's not even worth trying. You can input more, but the output will be the same.
Not all LCD monitors are the same. Some can do more than 60 Hz without skipping frames. If it skips frames, it should be obvious just by moving the mouse or dragging windows around because motion becomes jumpy. If it doesn't skip frames, motion becomes smoother. In either case, there should not be more lag.

silent-circuit said:
It'll kill the panel. Pure and simple.
No it won't.

exe said:
This is likely, and doesn't take long.
Based on what?

Somnambulator said:
I don't see any reason to do this... the panel itself is limited to 60Hz regardless of the input Hz...
Based on what? Did you test this particular panel?

Somnambulator said:
The only thing you're doing is shortening the screen's lifespan...
Based on what?

Why are people responding with nonsense?
 
I thought the panel wasn't the limiter; the controller is the actual limiter. Also, if 60Hz is the absolute cap for LCDs, why can 75Hz be selected without modification on many LCDs while still using regular DVI-D?
 
Ultra-high refresh rates were more likely to damage a CRT than an LCD. I still remember the shrill, high-pitched whine a CRT would make when I went "out of range", and the weird, distorted picture. LCDs don't have these problems; they just don't display an image, simple as that.

DVI is limited by bandwidth and VGA is not, so have fun with your "OC'd" display, Geminox. If that monitor dies, it's likely due to a defect, not refresh rates.
 
Anytime I use a laptop the first thing I do is see if it'll accept a higher refresh rate. I know there's some line of thinking that it makes no difference, 60 or 75 Hz, but my eyes - after decades of staring at 100+ Hz displays for graphics work - can easily spot the difference in many ways, and if the laptop supports it, 75 it is. The display becomes a bit snappier, windows refresh faster, and it just "feels" more responsive overall.

Sure, it could be just a placebo-style effect on some levels, but I couldn't care less. It's what I prefer, and it works, so that's what I use.
 
Anytime I use a laptop the first thing I do is see if it'll accept a higher refresh rate. I know there's some line of thinking that it makes no difference, 60 or 75 Hz, but my eyes - after decades of staring at 100+ Hz displays for graphics work - can easily spot the difference in many ways, and if the laptop supports it, 75 it is. The display becomes a bit snappier, windows refresh faster, and it just "feels" more responsive overall.

Sure, it could be just a placebo-style effect on some levels, but I couldn't care less. It's what I prefer, and it works, so that's what I use.


Totally a placebo effect on being able to SEE the refresh rate on an LCD, because there is NO refresh rate on an LCD :).


The response time might be a measurable thing though :)
 
I sure am glad that I don't notice a difference between 60 and 75; it sounds like a pain in the ass / waste of time.
 
I sure am glad that I don't notice a difference between 60 and 75; it sounds like a pain in the ass / waste of time.

It only makes a difference if that is something you look for (high-motion FPS or even non-FPS gaming). Most people use a computer for basic things, and why would an average person squeeze every ounce of performance out of something if they are only doing very basic tasks...

Might be a waste of time for one person, but a world of difference to another... I mean, that's the whole reason those 120Hz TVs exist (wish they were "true 120Hz" instead of interpolated frames)...

84Hz seems to be the limit at 16:10 widescreen 1440 x 900 for my panel, and as far as input/output lag goes, there does not seem to be any. This feels virtually like my CRT as far as motion goes... such an awesome monitor.
 
Anytime I use a laptop the first thing I do is see if it'll accept a higher refresh rate. I know there's some line of thinking that it makes no difference, 60 or 75 Hz, but my eyes - after decades of staring at 100+ Hz displays for graphics work - can easily spot the difference in many ways, and if the laptop supports it, 75 it is. The display becomes a bit snappier, windows refresh faster, and it just "feels" more responsive overall.

Sure, it could be just a placebo-style effect on some levels, but I couldn't care less. It's what I prefer, and it works, so that's what I use.

Indeed. I do notice a difference; I don't really think it's "placebo". As far as your eyes go, AFAIK a higher refresh rate is better, as it's less tiring for your eyes. I think I actually notice that.

Totally a placebo effect on being able to SEE the refresh rate on an LCD, because there is NO refresh rate on an LCD :).


The response time might be a measurable thing though :)

What does the refresh rate actually mean on an LCD then? :confused:

addon:
Not all LCD monitors are the same. Some can do more than 60 Hz without skipping frames. If it skips frames, it should be obvious just by moving the mouse or dragging windows around because motion becomes jumpy. If it doesn't skip frames, motion becomes smoother. In either case, there should not be more lag.

No it won't.


Based on what?


Based on what? Did you test this particular panel?


Based on what?

Why are people responding with nonsense?
Well, based on what evidence do you come up with all of this? Mind telling us, just curious...

I sure am glad that I don't notice a difference between 60 and 75; it sounds like a pain in the ass / waste of time.
A pain in the ass / waste of time? What does that mean? :confused:
 
LCD and CRT refresh rates are not comparable, because of the fundamental difference in the way pixels are lit.

On a CRT, a phosphor coating on the inside of the glass is excited by the electron gun. This lights up the phosphor and you see a lit pixel; however, phosphor will only stay lit for a fraction of a second before fading, hence you need a high refresh rate so that the electron beam scans past again before it does so. Low refresh rate = flicker fest.

On an LCD, a voltage is applied to liquid crystals, which bend/twist to let light through. Since this voltage is held constant until the pixel needs to change, it stays lit: no flicker.
 
On an LCD, a voltage is applied to liquid crystals, which bend/twist to let light through. Since this voltage is held constant until the pixel needs to change, it stays lit: no flicker.
I think you understand what we're looking for in this thread, but still, the problem isn't flickering; actually I wouldn't mind that at all. An LCD is only capable of changing every pixel on the screen every 16.67ms (60Hz, 60 times per second), and this clearly isn't enough in some cases. In FPS games a 180-degree turn easily takes less than 0.1s (and I use a relatively low sensitivity), which at 60Hz is less than six frames. At 150Hz, for example (I use 100-150Hz depending on resolution: 100 @ 1080p and 150 @ 720p), it equals 15 frames. In some cases I've measured reaction times from demos very close to 0.1s, and in some cases, from the first rendered frame where an enemy is seen, it takes only 0.15s to react, move the mouse and shoot when the enemy is expected from some direction. And this is without even considering input lag or anything else.

CS:S, for example, uses 33 "ticks" as the default: information is sent to the client 33 times per second. Most servers use 100 ticks, though, because 33 is not considered enough; 66 is also widely used, depending on the computing power and bandwidth available.
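The arithmetic behind both points is trivial to write down (illustrative numbers only):

```python
def frames_during(duration_s: float, refresh_hz: float) -> int:
    # Distinct refreshes that fit inside a motion of this duration.
    return int(duration_s * refresh_hz)

def ms_per_tick(tickrate: float) -> float:
    # Time between server updates at a given tickrate.
    return 1000.0 / tickrate
```

A 0.1s flick spans 6 refreshes at 60Hz versus 15 at 150Hz; server updates arrive every ~30ms at 33 tick, ~15ms at 66 and 10ms at 100.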
 
Refresh rate has NOTHING to do with the panel. You are changing the signal going to the monitor not the refresh rate of the panel. Unless a monitor is programmed to accept and respond to such a change, nothing will happen.
 
The common complaints that high-speed FPS gamers have had with LCDs are motion blur and "responsiveness".

Motion blur is improving, though it will probably never be totally gone due to the "sample and hold" effect of LCDs vs. CRT.

Responsiveness is why people look for high-refresh LCDs. Since DVI cannot support super high refresh rates, some people use VGA on an LCD to get them. As novemberrain mentions, higher refresh rates can show motion more clearly.

If the mouse is not jerky or "jumpy" when the higher rate is selected, that means the panel supports the higher refresh rate. If the panel supports it, it won't be damaged.

The ONLY component that could possibly be damaged by high refresh rates is the analog-to-digital controller that handles the VGA-to-digital conversion, and the likelihood of that happening is quite slim. I can't think of anything that could happen to the panel itself.
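The DVI limitation mentioned above comes down to pixel clock: single-link TMDS tops out at 165MHz, and active pixels × refresh × blanking overhead has to fit under that. A rough sketch (the ~20% blanking overhead is my assumption; real CVT/GTF timings vary):

```python
DVI_SINGLE_LINK_MAX_MHZ = 165.0  # single-link TMDS pixel-clock ceiling

def pixel_clock_mhz(h_active: int, v_active: int, refresh_hz: float,
                    blanking: float = 1.20) -> float:
    # Rough estimate: active pixels per second, padded ~20% for blanking.
    return h_active * v_active * refresh_hz * blanking / 1e6

def fits_single_link_dvi(h: int, v: int, hz: float) -> bool:
    return pixel_clock_mhz(h, v, hz) <= DVI_SINGLE_LINK_MAX_MHZ
```

By this estimate, 1440 x 900 @ 84Hz (~131MHz) squeaks under the single-link cap while 1680 x 1050 @ 84Hz (~178MHz) does not, which lines up with people falling back to VGA for high refresh rates at native.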
 
This thread is filled with so much misinformation that it's ridiculous. Some LCDs can indeed support more than 60Hz without skipping frames, and yes, there is a real benefit to a higher refresh rate in FPS games. Sorry, but the people who disagree simply don't know what they're talking about.
 
This thread is filled with so much misinformation that it's ridiculous. Some LCDs can indeed support more than 60Hz without skipping frames, and yes, there is a real benefit to a higher refresh rate in FPS games. Sorry, but the people who disagree simply don't know what they're talking about.

Isn't that what I just said?


From a technical standpoint, refresh rate means nothing to the LCD itself other than allowing you a higher FPS cap if your LCD supports > 60Hz.
 
I'm about to break down for you guys something that needs to be broken down!

A cathode ray tube is nothing more than a vacuum tube with an anode and a cathode; it just has them in different places, plus a phosphor coating on one end. Inside there is an amplifier and a high-voltage (flyback) transformer.

To keep things simple (which will put me on the verge of being told where I'm wrong), here's how it displays the image. The signal, whether VGA, DVI, BNC, analog or digital, is sent to the amplifier, where it is amplified, split and processed, then sent to the flyback transformer for control and fired through the electron gun. The electron gun is a single point from which electrons fly through a vacuum (so there are no ions to absorb them) before hitting a particular phosphor at one moment in time, which depends on the aperture grille frequency and the frequency at which the electrons strike, among other things. When an electron hits a certain phosphor, depending on its quantum state, the size of the forbidden gap and the threshold energy of the electrons, a photon is released out through the screen toward your corneas. Since the electron gun can only point at a certain group of phosphors at a certain time (depending on screen resolution), it has to sweep back and forth using finely tuned electromagnets that push and pull the electron stream: one pair moves it up and down, another moves it back and forth.

I can't thoroughly explain this without picking a resolution, so let's use 800 by 600. Once the gun has swept across from the top left of the screen and down through every line and come back to the very top left, it has completed one cycle, or Hz. Phosphors tend to decay from the state the electron stream put them in, back down to their natural state below the emission threshold; usually by the time the electron gun is at the bottom of the screen, the top is already 70 percent decayed. To compensate, we can turn up the refresh rate! A higher refresh rate means the phosphors hold a particular color more stably, and the image updates from the video card more often for a smoother picture. A CRT can even change frames midway through a scan, with a smooth transition from one frame to the next; this is why we can appreciate those extra headshots from a CRT at 100Hz. (And no, this is not vsync; vsync is where the video card starts drawing a frame before the last one is finished, over 3-4 frames, whereas this transition happens within one refresh of your 60Hz, or whatever your default is; that's the flicker you see.) If you turn up your resolution, it's harder for the deflectors to steer the electron stream steadily, so you have to turn the refresh down to avoid blurring, or use more of everything electronics-wise, like the FW900, and have hellacious back problems.

The reason CRTs produce more accurate colors is that the phosphor creates the photon, unlike an LCD, which blocks photons. When you turn up your refresh rate you have less flicker (which hurts your eyes), and you get exactly what your video card is producing at any moment (depending on your connection lag). Also, you can pull certain pins off your cable (nothing new) and run any resolution your video card can handle, although you may bring your CRT down to as little as 9Hz, which is not very good for your monitor, I assure you. Yes, we have had high definition on CRTs for years, even more so than your 720p/1080p stuff.

An LCD works differently. Duh!

LCDs work by having a backlight, dynamic or static, and they operate by deflecting light. Every color humans can see has a particular frequency (as do the ones we can't). To keep it simple, an LCD uses a crystal that oscillates back and forth to filter the color: the frequency at which it oscillates determines what the white backlight coming through is changed to. The crystal is at its fastest whenever it is showing black (it may be white, but the process and the point are the same :) ), so when it's oscillating fastest it's using the most power, which is the blackest the monitor can get. Crystals have a transition time between oscillations, which we refer to in milliseconds: 2 for TN, 6 and above for S-IPS or whatever, doesn't matter. Unlike a CRT, which can group a finite number of phosphors into whatever resolution it wants, an LCD cannot: it has a fixed number of crystals with transistors supplying variable voltages, and whether a transistor is off or fully saturated determines the frequency at which its crystal oscillates. This, boys and girls, is what we call a pixel, and a number of them sit on things we call panels. LCDs are crippled by the fact that they can only run one resolution natively; they do have scaler chips, but those cause a whole lot of aliasing and a whole lot of input lag. To get to the point: when hooked up via DVI, each pixel is driven by certain bits straight from the video card. Crystals are piezoelectric, so they vibrate when current flows. Although the electrons themselves literally travel at something like a mile per year, the current propagates near the speed of light; the electrons are what make the crystal vibrate, but the rate of the current, along with the thickness of the crystal cut, sets the frequency it oscillates at.
On an LCD the entire screen is updated one frame at a time, 60 times per second: 60Hz. So you get 60 frames per second whether your video card is running at 30fps or 80fps; if you're running at 30, every second refresh shows the same frame. But just because the screen is updated at 60Hz doesn't mean the crystals themselves have to be. Take a 60fps feed on an LCD where the crystal in each pixel takes 45ms to transition: the screen updates, all the pixels start changing toward the new colors, and they almost get there, but the frame changes again; they start toward the new colors a second time, but can't make it; a third time, to no avail. So you have a 60Hz LCD whose crystals are too slow to get from one color to the next, and a blurry-as-hell screen.

Now take a video card producing frames at 80fps into a 60Hz signal. A CRT would let you push 80fps at 60Hz; it would just flicker. An LCD will not: it is limited by the 60Hz cap, and because one frame is one refresh, it skips frames. Meanwhile, say the crystals are transitioning at 2ms; the crystals themselves can transition 4-6 times within each frame gray-to-gray and 1-2 times black-to-white. Say we had a solid 2ms on every color, a 10ms frame and a 60Hz signal: theoretically you could change 5 times within that 10ms frame and not have any blurring, but the crystals are limited by the 60Hz input. Now suppose we pushed the screen to 120Hz under those conditions. It would take the cap off the crystals; instead of waiting for a 60Hz signal to tell them to change color, they would change at a faster rate, maybe not 120Hz, but maybe somewhere around a CRT-like 72-80Hz. And if you overdrove them even more than the manufacturers do now, you could get even closer to a 120Hz screen.


You silly consumers: a 60Hz signal has nothing to do with the rate at which a crystal can oscillate. They just don't want to change them yet; it's marketing. All they would have to do is change the thickness of the crystal being cut, simple as that. Have you ever noticed that the 120Hz screens are snappy, then slow, snappy, then slow, with all kinds of problems? The pixels themselves can do 120Hz; hell, that's easy: a crystal in a sound card oscillates from 20 to 20,000Hz, doesn't it (correct me if I'm wrong, please do)! They are snappy-then-slow because the signal is 60Hz and they are producing 120Hz off of that signal; how can you double your fps without a scaler chip, if there isn't something driving it directly? The scaler chips used in today's monitors are exactly what is keeping us from 120Hz screens. You think that because you can see the pixels (screen-door effect), the panel is limited to 60Hz? No way; that is so freaking slow. What you need is a monitor with a good scaler chip. I have the L227WTG-PF and you wouldn't tear it from my cold dead hands,
BECAUSE I WORK AT FREDS INC MAKING 800 DOLLARS A MONTH AND IT TOOK ME 3 YEARS TO SAVE UP FOR THIS PUPPY.

OK, so yes, you can have your monitor go faster than 60Hz; it's just that manufacturers need to produce monitors without scaler chips, and the chips need to go into the video cards. Yes, my monitor can do 75Hz with no frame skipping. Have you ever noticed the image gets smoother whenever you have a lower response time, no matter whether it's set at 60Hz or not? Well, once you get down to around 2ms, it can't go any faster than 60Hz; that's why progress has stalled below 2ms response times. We are waiting for scaler chips that can go faster than 60Hz (maybe a 120Hz scaler chip) so we can go to 1ms. So you guys and gals with 6ms monitors: don't even attempt to go past 60Hz; you are nowhere near the 2ms border and won't notice a difference! But for those of you lucky enough to have an L227WTG-PF like me, we can enjoy 1:1 pixel mapping at all resolutions, no yellow-tinged whites, and a 2ms response time. By the way, modern TN panels from the last two years are way better than they were, and even though it has TN viewing angles, its ability to give me clear games and deep, vivid colors is far superior, to me, to an H-IPS capped at around 50Hz. When you pull those pins out, your scaler chip and EDID don't know what you are doing, therefore letting you take advantage of your pixels' response time rather than your fps cap; you can have up to 5 transitions within one frame on an LCD, but the scaler chip and firmware cut it to one per frame.
I understand one frame is about 17ms, so theoretically a 2ms crystal can change 8 times within each frame, if all colors had the same transition time. At 60Hz, 60 x 17 equals 1020, so dividing by a 2ms crystal you would get 510 transitions per second if there were no cap. But since all pixels are updated at once, some crystals finish transitioning while others don't; whether the scaler chip is programmed to let individual pixels change, instead of the screen as a whole, makes a big difference. This is why we have an fps cap at 60Hz, and why people changing to 75Hz are getting better results: if the scaler lets the signal pass through to the pixels, you get a smoother picture; if it doesn't, you're just pumping 75Hz into a scaler that brings it down to 60Hz and skips. Luckily my monitor lets it pass! People with too high a response time can't take advantage. And for those who have tried it on their monitors using DVI: try switching back to VGA to see if you get a smoother image. I can on both DVI and VGA, but some can only on VGA.
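The transitions-per-frame arithmetic in the post above can be written out like this (idealized: it assumes the quoted response time applies to every color transition, which real panels don't deliver):

```python
def frame_period_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

def transitions_per_frame(refresh_hz: float, response_ms: float) -> float:
    # How many times a pixel could fully settle within one refresh.
    return frame_period_ms(refresh_hz) / response_ms
```

At 60Hz a frame lasts ~16.7ms, so a 2ms pixel could in principle settle ~8 times per frame, while a 16ms pixel barely finishes once; that's the "2ms border" argument in a nutshell.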
 
While all that is really technical (I did read all of it) and mostly accurate, it's really a moot point.

When you change the refresh setting in Windows, all you are doing is telling the video driver to set the card to output that signal to the monitor. When the monitor receives that signal, it responds by entering that display mode. If it does not support that display mode, it either defaults to something close or a safe mode, or gives you the "out of range" message.

Bottom line: the monitor must support the mode electronically in its hardware to receive any benefit from a higher refresh rate. In the case of an LCD, if it does support the higher rate, this will increase the frames displayed and raise the cap imposed by vsync. But the pixels (crystals) themselves still only respond at a given rate, based on the controller and the voltages received.
 
vick1000: When you change the refresh setting in Windows, all you are doing is telling the video driver to set the card to output that signal to the monitor. When the monitor receives that signal, it responds by entering that display mode. If it does not support that display mode, it either defaults to something close or a safe mode, or gives you the "out of range" message.


That is correct for an LCD monitor that hasn't had the EDID pins removed. But there's more to it than display resolution; there's also refresh rate data as well. Have you removed your EDID pins, and what type of monitor do you have?

vick1000: Bottom line, the monitor must support the mode electronically in its hardware to receive any benefit from a higher refresh rate.


The panels themselves do. It's the scaler chip/firmware revisions, and the type of panel-matrix voltage array, that could be holding them back, depending on whether your pixel voltages are overdriven close enough to the 2ms barrier we currently have. If not, say you have a panel whose fastest response time is 6ms and slowest is 15ms: then no, you're not going to see much of an improvement even if your scaler allows it.

vick1000: In the case of an LCD, if it does support the higher rate, this will increase the frames displayed and raise the cap imposed by vsync.
But the pixels (crystals) themselves still only respond at a given rate, based on the controller and the voltages received.


That's the point I'm making by taking the time to type all of it out. Some pixels are being overdriven by the manufacturer past the point where you're putting a cap on them by not running your refresh higher than 60 Hz. The majority of these panels are TN based, but who cares. Some people, like this thread starter, understand that. I've been doing it since I got my monitor; I like to OC anything I can get a hold of. It's my hobby.

Why would you call my explanation moot, then try to restate what I said less technically? It's called Hardforum.com.

Yes, you can tear up your monitor if you try changing the actual pixel voltages, but by raising the refresh rate you're just not allowing the pixels any time to stop and wait for a new frame between oscillations.

The only foreseeable way you could tear up your screen would be to change the voltage overdrive settings through a firmware hack, or to enter service mode and change the panel type. A specific firmware revision may support 5-10 different panel types within a given monitor, each with different pixel voltages, and picking the wrong one could turn your screen completely white or black, or kill your crystals with the wrong voltage. What I'm talking about is just using the voltages that are already there, set by the manufacturer for your specific panel, overdriven or not. It wouldn't do a thing to hurt your display! You're not changing anything but the input signal (which, depending on the scaler hardware and firmware, might make some monitors skip frames or let the pixels change to the next frame!).

Other than that, just like 10e said, nothing will happen! The worst thing that could happen has a chance slim to none.

Besides, when overclocking there is always risk! Enjoy life, bro, you only have one!
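For anyone curious what the pin mod actually skips: EDID is a 128-byte block the monitor serves to the card over the DDC pins. It starts with a fixed 8-byte header and ends with a checksum byte chosen so all 128 bytes sum to zero mod 256. A minimal validity check, as a sketch (not tied to any particular monitor):

```python
# Fixed 8-byte header that every EDID base block starts with.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def looks_like_edid(block: bytes) -> bool:
    # Base EDID: 128 bytes, fixed header, and a checksum byte that
    # makes the whole block sum to 0 modulo 256.
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)
```

Cutting the DDC pins simply prevents the card from ever reading this block, so the driver stops limiting itself to the monitor's advertised modes and accepts whatever timings you feed it by hand.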
 
It only makes a difference if it's something you look out for (high-motion FPS, or even non-FPS gaming). Most people use a computer for basic things, and why would an average person want to squeeze every ounce of performance out of something they only use for very basic tasks...

Might be a waste of time for one person, but a world of difference to another... I mean, that's the whole reason those 120Hz TVs exist (wish they were "true 120Hz" instead of using intermediary frames)...

The 84Hz refresh rate seems to be the limit at 1440 x 900 (16:10 widescreen) on my panel, and as far as input/output lag goes, there doesn't seem to be any. This feels virtually like my CRT as far as motion goes... such an awesome monitor.

Exactly!!! I couldn't have said it better myself.
 
Holy christ man, thank you for your well of knowledge :)

Today, out of plain curiosity, I decided to see how high I could get my refresh rate at my native 1680x1050 resolution, and I was able to hit 76Hz over DVI!

This is much more comfortable than 60Hz... but were you able to push it any higher than that? I haven't tried cutting the EDID pins on my DVI cable; they're only cut on the DVI->VGA adapter... Will I see results if I try this on the cable itself?

I can barely hit 84Hz with the D-SUB/VGA cable... DVI seems to have some kind of limitation for some reason (glass ceiling somewhere?).

If so, please share your experiences/results from it... it means I'll have to buy another DVI cable to tinker with ;)
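One likely explanation for the DVI "glass ceiling": single-link DVI tops out at a 165 MHz pixel clock, and the clock a mode needs grows with both resolution and refresh. A rough sketch; the 160/30-pixel blanking figures below are illustrative reduced-blanking-style values, not this monitor's actual timings:

```python
SINGLE_LINK_DVI_HZ = 165_000_000  # single-link TMDS pixel-clock ceiling

def pixel_clock_hz(h_active: int, v_active: int,
                   h_blank: int, v_blank: int, refresh: int) -> int:
    # Required pixel clock = total pixels per frame * refresh rate.
    return (h_active + h_blank) * (v_active + v_blank) * refresh

clk_76 = pixel_clock_hz(1680, 1050, 160, 30, 76)  # 151,027,200 -> fits
clk_84 = pixel_clock_hz(1680, 1050, 160, 30, 84)  # 166,924,800 -> just over
print(clk_76 < SINGLE_LINK_DVI_HZ, clk_84 < SINGLE_LINK_DVI_HZ)
```

Under those assumed timings, 76Hz at native squeaks under the single-link limit while 84Hz just misses it, which would line up with 84Hz only working over VGA, where the ceiling is set by the analog electronics instead.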
 
geminox:

Have you tried forcing higher refresh rates with PowerStrip or Nvidia's custom resolutions on a normal cable, or are you already doing that? I'd still be somewhat surprised if cutting those pins off was the only way to do this.
 