Gamers! I pushed my LCD monitor to an 84 Hz refresh rate!

The reason DVI has this limit you are talking about: to be honest, I haven't got the foggiest! I think it has to do with the bandwidth of DVI vs. VGA; VGA isn't bandwidth-limited per se, but it can have a lot of artifacts at higher resolutions, including ultra blurriness from the clock and phase. DVI, being digital, has to deliver a perfect, set-in-stone signal at each and every resolution just to work without artifacts. The scaler (I say scaler, but it's really a bunch of hardware inside) has to process the digital signal if it's scaling it, and we are talking about downscaling a 1680 by 1050 signal. Have you ever had, say, a 486 DX with 16 MB of RAM in it? If you tried scaling an image that big down to half its size, it would take a couple of seconds (a different process than rendering a picture). We're talking 60 fps or more that the monitor has to downscale in real time. With DVI it has to be perfect; with VGA it doesn't, it just goes through the scaler the same way but by a different path. You can have more artifacts, but for the most part you get more refresh. What you are doing by upping the refresh is causing the scaler to process more info per second. This is why DVI needs two links (dual-link) whereas VGA only needs one cable.
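To put a rough number on that DVI limit, here's a quick back-of-the-envelope sketch in Python. The blanking overhead (160 extra pixels per line, 30 extra lines per frame) is an assumption loosely in the style of reduced-blanking timings, and 165 MHz is the single-link DVI pixel clock ceiling; the real timings PowerStrip generates will differ a bit:

```python
# Rough pixel-clock estimate for 1680x1050 at various refresh rates,
# compared against the ~165 MHz single-link DVI limit.
H_ACTIVE, V_ACTIVE = 1680, 1050
H_BLANK, V_BLANK = 160, 30          # assumed reduced-blanking-style overhead
SINGLE_LINK_DVI_MHZ = 165.0

for refresh_hz in (60, 73, 84):
    pixel_clock_mhz = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK) * refresh_hz / 1e6
    fits = "fits on single-link DVI" if pixel_clock_mhz <= SINGLE_LINK_DVI_MHZ else "needs dual-link (or VGA)"
    print(f"{refresh_hz} Hz -> ~{pixel_clock_mhz:.0f} MHz pixel clock: {fits}")
```

With these assumed numbers, 60 and 73 Hz land comfortably under the limit, while 84 Hz at that resolution is right at or just over it, which is roughly where the DVI wall people hit in this thread shows up.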

Too bad we don't have a 22-inch with two DVI inputs and four scalers, one for each corner. :)

I know a little about a lot. I'm only 23, but I learned a lot from a dear friend named Wilhiem Hartzfeldt. He taught consumer electronics and industrial electronics at a technical college where I'm from. He knew everything; the guy made an oscilloscope from an old black-and-white TV once. He was what you would call a genius. He was still teaching at around 76 when he passed away. This guy knew Dr. Lee de Forest in person and actually worked in his lab.

This stuff is all marketing. He said they had 3D TV back in the '50s.

Let's put it this way: I learned everything from the six phases of matter (Bose-Einstein, solid, liquid, gas, plasma, aether, which are all the same thing) all the way to how to make a PN junction with selenium and gold powder. This guy could literally build a TV from the ground up. He was a personal friend. You won't see much of a difference, so just leave your cable alone! Using PowerStrip or modding your cable won't make as much difference as your monitor itself will. I use the VGA cable myself instead of DVI; the difference DVI made on my monitor (a tad bit sharper) was worth less to me than being able to push a higher refresh. I wouldn't even bother with DVI on this particular monitor.
 
Removing the EDID pins will just prevent the OS from limiting the output modes (or worse, reduce the modes to standard VGA only); the EDID is sent to the video card from the monitor, not the other way around. You can force unsupported refresh rates anyway through a software bypass.
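You can see for yourself that the EDID comes from the monitor side: on a Linux box with KMS the kernel exposes the raw block it read from the monitor, and the range-limits descriptor inside it is exactly where the "allowed" refresh range comes from. A rough Python sketch (the /sys path varies per connector, so treat that exact path as an assumption):

```python
# Read the raw EDID the kernel fetched from the monitor and print the
# monitor range-limits descriptor (tag 0xFD), which holds the min/max
# vertical refresh the monitor advertises to the video card.
edid = open("/sys/class/drm/card0-DVI-D-1/edid", "rb").read()  # example path

for offset in (54, 72, 90, 108):                    # the four 18-byte descriptor slots
    d = edid[offset:offset + 18]
    if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:  # display descriptor, range limits
        print(f"vertical refresh: {d[5]}-{d[6]} Hz")
        print(f"horizontal scan : {d[7]}-{d[8]} kHz")
        print(f"max pixel clock : {d[9] * 10} MHz")
```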

This still does not alter the controller in the monitor, hence my statement that all your data is moot for the purposes of this discussion.
 
...but by raising the refresh rate you're just not allowing the pixels any time to stop and wait for a new frame between oscillations.

...

That is false, unless the monitor's controller is programmed to use a higher refresh rate, as I previously stated.
 
vick1000: Removing the EDID pins will just prevent the OS from limiting the output modes (or worse, reduce the modes to standard VGA only); the EDID is sent to the video card from the monitor, not the other way around. You can force unsupported refresh rates anyway through a software bypass.


That is correct: removing the EDID pins will prevent the OS from limiting the output modes, letting you run a higher refresh rate and unlocking the ability to get a smoother image, hence what the entire thread is about. Once the pins are removed, or you do the same thing through PowerStrip (which I said in my last post), it unlocks the ability to push a higher refresh rate at native resolution, up to the max the controller can handle. Also, once the pins are removed, it doesn't matter whether the monitor sends to the video card or the video card sends to the monitor; you can then send a higher refresh rate.


vick1000: This still does not alter the controller in the monitor, hence my statement that all your data is moot for the purposes of this discussion.


If you read back, I said that the controller/firmware and scaler are the very things holding some monitors back from achieving a higher refresh rate. I stated that some controllers/scalers will let the higher refresh rate through and others won't, depending on the monitor. Think about it like this: two identical panels in two different monitors, where one has lag, or significant lag, and the other doesn't. What causes this? The controller!

When I said, "but by raising the refresh rate you're just not allowing the pixels any time to stop and wait for a new frame between oscillations,"

you commented:
vick1000: That is false, unless the monitor's controller is programmed to use a higher refresh rate, as I previously stated.


I'm not arguing that it won't; the whole point of what I'm saying is that it will, and some monitors can, like the L227WTG-PF.

Also, in post 28 you said:

vick1000: Refresh rate has NOTHING to do with the panel. You are changing the signal going to the monitor, not the refresh rate of the panel. Unless a monitor is programmed to accept and respond to such a change, nothing will happen.

Well guess what, removing the EDID pins removes the programming, which allows us to run higher refresh rates. That allows pixels rated 2 ms gray-to-gray (and 5-8 ms for everything else) to change at least twice within a 17 ms frame at 60 Hz, giving you a smoother image as you move up in refresh rate, if and only if the monitor allows it. Why are you arguing with what I'm stating, and how is this moot? I'm not the only one who has come to this conclusion. You should take what other people say and learn from it, not attack every little thing they say and try to disprove them. You come up with some kind of proof and I'll quit responding to your attacks.

Why the hell would anybody call someone's point moot? If I, along with a lot of other people, see something I don't like or don't agree with, I usually move on unless I know they're wrong. Why won't you answer the questions I asked you in the post back, and tell me exactly every little thing I said that was moot to this discussion?
 
Also, if you choose not to remove the pins, other software like PowerStrip will allow you to bypass some functionality of the programming (not all). Other monitors won't allow it and, as vick1000 said, will revert back to VGA only to protect the monitor from being damaged, although an LCD is not likely to be damaged by upping the refresh too much the way a CRT would be.
 
On XP you can load a different monitor driver that supports higher res with higher refresh rates on tap.
(I'm not sure about Vista as I'm not running it here but you can try)
Be sure to untick "show compatible hardware" while manually installing the driver and then you will be presented with all the installed drivers to choose from.

This will allow you to select almost any mode you want.
For example, load the Iiyama A201HT VisionMaster Pro 510 driver and you will get up to 85Hz at 1920x1200 and even higher refresh rates at lower resolutions.
Strangely, for me it only gives 60Hz at 1680x1050 res so you may need to try another driver if that is your res, but you get the idea.
 
Well guess what, removing the EDID pins removes the programming, which allows us to run higher refresh rates.

It does not remove anything except the EDID data being sent to the OS. You cannot alter the controller or pixels by changing the refresh rate, unless the controller is programmed to respond to the requested mode, in which case a simple software bypass of the EDID will allow the request even if the firmware does not.

Why must I provide "proof" of my claims while you do not?

I'm saying all the information you provided in the original post I targeted is irrelevant (moot) because it does not offer any data concerning refresh rate and pixel response, other than a blanket claim that they are related. The bulk of it is your description of how CRT and LCD displays work, while omitting any relevant information concerning EDID and refresh rate.

Never mind that refresh rate and frames per second have nothing at all to do with pixel response times. The pixels always change as fast as they can when they are instructed to do so (overdrive aside), regardless of the requested refresh rate. Of course, if the controller were to request the panel to refresh 75 times per second, they indeed would do so, but they still cannot change state any faster, which is the problem with LCDs.

If you are claiming this particular model will request a panel refresh 84 times per second, where is the proof? All I have seen from you is a massive post with little to no relevant information, and a subjective claim from the topic creator.
 
Vick, you can claim it doesn't, but until you mod your cable and get a monitor capable of doing it, stop stating that it can't be done. I'm already doing it. That is not everything the EDID sends to the computer either, but it doesn't matter. I won't argue with you anymore.
 
Can't you do the exact same thing using PowerStrip? Or even without any software, with some registry hacking?
 
Yes, you can, but to enable other settings (or I should say disable the set refresh rates) it's better to remove the pins, though it's somewhat pointless as well. That's what I use myself, but I use VGA because it isn't limited the way the DVI input is. I can hit 73 Hz at 1680 by 1050 without any artifacts.

Also, whether it will frame-skip or not depends on the monitor. Some controllers/scalers will take the higher 73 Hz and frame-skip it back down to 60 Hz, so all you are doing is sending a higher refresh signal that does nothing. But some monitor controllers, like the L227WTG-PF's, will take the refresh and, instead of letting the pixels sit waiting on the next frame to oscillate, will let them start changing to another color 73 times per second instead of 60. The reason is that a 60 Hz display will only let the pixels start changing, all at once, 60 times per second.

Also, a frame is theoretically 17 ms, so if your pixel response time is at most 2-8 ms, you could actually change your pixels two times or more within that frame. So by bringing your refresh up to 120 Hz, double that of 60 Hz, you aren't doing anything to the pixels except allowing them not to change at a slower rate; you are, so to speak, unlocking them for a smoother image. You're not doubling your fps, just getting a smoother image up to 120 Hz. What is holding me back now is the analog signal and the controller not being able to give each pixel the information it needs fast enough without artifacting; it can't change the analog signal fast enough and accurately beyond that.
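To make the numbers in that last paragraph concrete, here's a small sketch comparing the frame period at a few refresh rates against quoted pixel response times (the 2 ms gray-to-gray and 8 ms figures are just the specs being argued about here, not measurements):

```python
# Frame period vs. quoted pixel response time at a few refresh rates.
# If the response time is well under the frame period, the pixel finishes
# its transition and then sits idle until the next frame arrives.
RESPONSE_TIMES_MS = {"2 ms gray-to-gray": 2.0, "8 ms typical transition": 8.0}

for refresh_hz in (60, 73, 84, 120):
    frame_ms = 1000.0 / refresh_hz
    print(f"{refresh_hz} Hz -> frame period {frame_ms:.1f} ms")
    for label, resp_ms in RESPONSE_TIMES_MS.items():
        idle_ms = frame_ms - resp_ms
        verdict = "finishes early, then waits" if idle_ms > 0 else "still transitioning"
        print(f"  {label}: {verdict} ({idle_ms:+.1f} ms slack)")
```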

Edit: sorry about the repeat, but I'm trying to come up with an easier response.
 
Vick, you can claim it doesn't, but until you mod your cable and get a monitor capable of doing it, stop stating that it can't be done. I'm already doing it. That is not everything the EDID sends to the computer either, but it doesn't matter. I won't argue with you anymore.

You cannot change the firmware or behaviour of the controller by eliminating the EDID signal; all you are doing is bypassing a "lock" on the output signal of the video drivers.

I don't know what you think you are doing to your monitor, but you are not changing anything in it, software or hardware, just bypassing a software refresh rate lock. Its capabilities remain the same regardless, and again, there is no need to modify hardware to bypass the EDID signal.

Having an LCD image updated 73 times a second as opposed to 60 is also a subjective matter. On a CRT, 60 Hz was very noticeable since the gun had to sweep the screen, causing a flicker, while an LCD updates the entire image at once, 60 times per second. The difference may not even be noticed by most users.

In any case, the only benefit is as I stated: a higher frame cap using vsync. The pixels still "ghost" or "smear" at 75 Hz as much as at 60 Hz. Don't you think that if higher refresh rates equaled better performance, it would have caught on and we would have high refresh rate LCDs all over the place?

Do some research on black frame insertion; those panels update 120 times per second.
 
The reason 60 Hz is the standard is because most of the film industry uses television recordings of 60 frames per second with 3-2 pulldown, and 25 fps film doesn't work well with anything higher; and as a manufacturing standard, to keep prices lower, they stick to 60 Hz for all LCDs, because a lot of the hardware shares a lot of the same chips.
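For anyone unfamiliar with the 3-2 pulldown mentioned here, a tiny sketch of the cadence: each film frame is shown for alternately three and two display fields, which is how 24 fps material gets stretched onto a 60 Hz display (the exact frame rates quoted above are part of the disagreement in this thread, so this only illustrates the mechanism):

```python
# 3:2 pulldown cadence: 24 film frames per second become 60 displayed
# fields per second by alternating 3-field and 2-field repeats.
def pulldown_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2   # 3, 2, 3, 2, ...
        fields.extend([frame] * repeats)
    return fields

one_second_of_film = [f"F{i}" for i in range(24)]
fields = pulldown_32(one_second_of_film)
print(len(fields), "fields per second")   # 24 * (3 + 2) / 2 = 60
print(fields[:10])                        # F0 x3, F1 x2, F2 x3, F3 x2
```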

In answer to your question: if there is no difference, then why even bother with the 75 Hz at 1440 by 900 that a lot of newer monitors have? If it's true that there is no difference, and the manufacturers would do something about it if there were something better, why do they give you the option to push 75 Hz to your screen at that resolution, if as you say monitors don't get any benefit from it?

Also, about this lock on the signal that I'm bypassing: I'm glad you finally came to the conclusion that we can do this.

Whether you see any difference also depends on the monitor having a low response time. Your response time has to be at least twice as fast as 16 ms to get anything smoother. If even one pixel transition takes longer, say 24-29 ms, then you are correct: you will have smear or ghosting, as you say.

By the way, black frame insertion, or BFI, is a gimmick. All they are doing is taking the period of time between each frame of the 60 Hz signal, which is what I'm talking about, and causing all the pixels to turn to the color of the frame provided and then to black until the next frame comes along. It simulates a flicker effect. Only TNs will benefit from this tech, because they are near the 1.73 ms border that we need in order to go higher than the 60 Hz refresh rate. Why not use this to just do what I'm describing?

Edit: I took the time to learn a little bit more about BFI. It turns out they do it the way I described for TNs, but for VAs and IPS they strobe the backlight in order to produce an LCD with persistence of vision to simulate a CRT. The reason they are doing this isn't to flicker everyone's eyes; it's to take the wasted oscillating time, all the way up to 16.7 ms, and emulate frame doubling without any extra processing. Instead of actually drawing a new frame during the period of time I was talking about, they use image persistence to make your brain do it.
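As a rough illustration of why shortening how long each frame stays lit reduces perceived blur, here's a sketch using the usual sample-and-hold approximation (perceived blur in pixels is roughly how long the image persists times how fast the object moves); the speed and strobe numbers are made-up examples, not measurements of any specific panel:

```python
# Sample-and-hold motion blur approximation:
# blur (pixels) ~= persistence (s) * object speed (pixels/s)
OBJECT_SPEED_PX_PER_S = 960   # example: object crossing half a 1920-wide screen per second

scenarios = {
    "full-persistence 60 Hz hold": 1 / 60,   # image lit for the whole 16.7 ms frame
    "full-persistence 120 Hz hold": 1 / 120,
    "strobed backlight, 2 ms flash": 0.002,  # BFI / backlight-strobing style
}

for label, persistence_s in scenarios.items():
    blur_px = persistence_s * OBJECT_SPEED_PX_PER_S
    print(f"{label}: ~{blur_px:.1f} px of perceived blur")
```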

I feel sorry for those epileptics who aren't too fond of screen flicker! lol

In doing this they can emulate 120 Hz without actually changing the 60 Hz signal standard.

Also, the standard for CRTs was 85 Hz, to help epileptics.
 
When was the CRT standard not 60 Hz? When your PC boots up, your POST screen is at 60 Hz.
 
Standard television broadcasts at 30 fps, not 60, and film is shot at 24 fps; you are way off.

I never said there was no benefit to 75 Hz; I clearly stated what the benefit is, as long as the controller supports the higher rate.

I never said you cannot bypass the EDID signal; I don't know where you are getting that from. I clearly stated you can by way of software, and you do not have to mutilate a DVI cable to do it.

You will still have ghosting anyway, because very few transitions are from one shade of grey to another, and the 2 ms G2G specification is, as always, not completely accurate. Real response times, being black to white to black, are in the region of 16-24 ms. Even if the 2 ms were accurate, it's still higher than your magical "border", is it not?

BenQ made a panel with a near 0 ms response time and still observed ghosting; that's why they developed BFI, look it up. Concerning BFI, I was not referring to the refresh rate signal (60 Hz), I was telling you that the pixels update 120 times a second. This was so you would know they are capable of making them update more than 60 times per second, but the industry sticks with 60. It's not because of film or television broadcast FPS either.
 
Have you verified (beyond "it feels better") that this increased refresh rate is actually being used?
 
vick1000: Standard television broadcasts at 30 fps, not 60, and film is shot at 24 fps; you are way off.



I did state 60; I meant 30 fps. I also stated film is shot at 24 fps (my mistake was irrelevant to the argument).


vick1000: I never said there was no benefit to 75 Hz; I clearly stated what the benefit is, as long as the controller supports the higher rate.


Well, I'm telling you this monitor can accept a higher rate and there is a difference. If you don't believe me, buy one, or quit arguing with what I'm telling you and others, because I can do it. Why in the hell would I argue on here and try to ruin what little rep I'm trying to build if it couldn't?

vick1000: I never said you cannot bypass the EDID signal; I don't know where you are getting that from. I clearly stated you can by way of software, and you do not have to mutilate a DVI cable to do it.

"also this lock on the signal that im bypassing im glad you finally came to the conclusion that we can do this." in which i said this because you said

vick1000: all you are doing is bypassing a "lock" on the output signal of the video drivers.
I don't know what you think you are doing to your monitor, but you are not changing anything in it, software or hardware, just bypassing a software refresh rate lock. Its capabilities remain the same regardless, and again, there is no need to modify hardware to bypass the EDID signal.


So at first you said mutilating the EDID pins wouldn't do anything, but you also stated that you are bypassing a lock on the output signal for the video drivers. If it's not doing anything, why would you say it's bypassing a lock? And what is this lock there for? Since you know everything, what does it stop people from doing, regardless of software or hardware?

vick1000: BenQ made a panel with a near 0 ms response time and still observed ghosting; that's why they developed BFI, look it up. Concerning BFI, I was not referring to the refresh rate signal (60 Hz), I was telling you that the pixels update 120 times a second. This was so you would know they are capable of making them update more than 60 times per second, but the industry sticks with 60. It's not because of film or television broadcast FPS either.


I never stated you couldn't have ghosting. I still have ghosting at 73 Hz; it's just smoother. Hell, CRTs had ghosting too (CRTs had image persistence, but it looks the same as ghosting).

I wasn't referring to the refresh rate signal either. I was referring to the brief period of time within each 16.7 ms frame when the pixels, after they have already finished transitioning, sit stagnant, producing a ghosting effect. Also, even BFI is 60 Hz at the panel; it just appears to produce a faster refresh rate because, I'll state it again, the stagnant time isn't wasted on TN panels: the transistors in each pixel are driven to a certain saturation immediately after hitting each color's saturation, which produces a black flicker. This is a problem on VA and IPS panels because their pixel response time is so high that they don't have that stagnant period, so instead they strobe the LEDs or the CCFL lamps right at the beginning of each frame. What this does is erase, so to speak, all ghosting and replace it with black frames. Samsung does backlight strobing and BenQ does it the pixel saturation way. In my opinion the Samsung approach will be better, because it won't add any extra controller processing.

And it wasn't the monitor at 0 ms that had ghosting; it was at the point of retinal persistence.
 
Once the technology from higher-end TV sets trickles down to monitors, we should see monitors capable of 100 Hz and 120 Hz.

Before some of you throw a fit about how much you hate Sony/Samsung/Toshiba/whoever's 120 Hz mode and the fake frame creation, remember that those TVs are taking 50/60/24 Hz content and making up frames, whereas with gaming, etc., computers actually generate 120 discrete frames, so it would actually be beneficial for computer content. This is where DisplayPort, with its ability to do 1080p at 120 Hz, may actually make itself useful.
 
Honestly, all I can do is tell you that I feel the difference as well as see it.

Why doesn't somebody write some software that alternates one frame white and the next black? Then we could adjust the rate of our LCDs to see if it works.
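Something like that is only a few lines with pygame; here's a rough sketch (pygame is just one convenient option, and whether the flip is actually vsynced to the monitor's refresh depends on your driver settings, so the measured FPS is only a sanity check):

```python
# Alternate full-screen white and black frames and report the achieved
# frame rate. If vsync is honored, the rate should track the refresh
# rate set in the driver (60, 73, 84 Hz, ...). Press Esc to quit.
import pygame

pygame.init()
screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
clock = pygame.time.Clock()
white = True
running = True

while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT or (
            event.type == pygame.KEYDOWN and event.key == pygame.K_ESCAPE
        ):
            running = False
    screen.fill((255, 255, 255) if white else (0, 0, 0))
    pygame.display.flip()
    white = not white
    clock.tick()                          # no cap; just measure time between flips
    print(f"\r~{clock.get_fps():5.1f} fps", end="")

pygame.quit()
```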
 
Man, your reading comprehension skills suck.

I said that removing the EDID pins won't change anything in the monitor, and it won't.

What's all that about building "rep" anyway? Why should that have anything to do with the truth?

I find it almost comical that you think you can reduce ghosting by raising the refresh rate on your LCD, when the industry has been trying all sorts of crap for years to reduce it, to no avail. You even said yourself it's the brain that causes it. But 13 Hz is going to improve it? Right.
 
Dudes, you are both pissing in the wind. He in no way said he can reduce ghosting.
He is clearly not a native English speaker, so there is a bit of a language barrier in understanding what he has written and in his understanding of what you have written.

You have both raised some excellent issues and given some good information.
It's best left there.

Thanks to you both for some pretty good stuff.
 
I have no idea why I didn't think about that test. And yes, the higher I go, the smoother and less pronounced the block is.

Also, the scaler itself doesn't have as much processing to do on these small squares; a full-screen image test would be needed.
 
I agree, no more arguing. I'm sorry if I offended you, vick1000; you clearly know a lot yourself. To each his own. I appreciate your info.
 
Would this prevent my screen from bringing up a warning dialog and shutting down automatically when I push my refresh rate too high with PowerStrip?
 
Hello, I'm from the future. :)
I was able to push an LG W2252TQ to 75 Hz at native resolution over DVI.
Technically, it's not "pushing it", as it's natively rated for 75 Hz.
Anyway, I have had this monitor for some time, and in terms of image quality it is better than most of today's (budget) monitors, even after all those years. But I'd like to know how I can push it further, beyond that 77 Hz wall it natively allows (for me it works at 77 Hz, but at 78 Hz it says out of range).
My main question is: can I damage my monitor by running it at high refresh rates? I have read here that you can, but also people saying it won't damage it at all. Right now I have only gotten to 72 Hz, which is the limit of my single-link DVI. By the way, I heard they can get damaged by heat, but mine isn't heating up that much, and at lower brightness it runs cold.
Thanks if you reply at all :D
 
backtothefuture-88hz.png
 