Blur Busters Is Testing a 480 Hz Display

But why?

Most people can't even tell the difference between a 240 Hz and a 144 Hz panel. They have to be right next to each other to see a difference, and even then it's not a major upgrade. It's nothing like going from a 60 Hz to a 144 Hz panel.
 
I agree with Rogue. Unless this has some specific use in some specific field, I don't see the point.
 
And then it turns out that there is nearly three seconds of lag between the frame being generated and the monitor refreshing to show it...

But why?

Most people can't even tell the difference between a 240 Hz and a 144 Hz panel. They have to be right next to each other to see a difference, and even then it's not a major upgrade. It's nothing like going from a 60 Hz to a 144 Hz panel.

Think of all those 1950s and 1960s ideas about subliminal messaging being buried in one frame out of however many.
 
Well... I suppose I could go play Quake 3 on that screen :)
 
Personally, I don't notice a change in refresh rate after about 100 Hz. My X34 is overclocked to 100 Hz and I can't tell the difference between it and my 144 Hz Dell.
 
Jeez at some of the replies here. For those who run without vsync, there is less noticeable tearing the higher the refresh rate is, regardless of what framerate you have.
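
A rough way to see the "why" behind that: a tear line only survives until the next refresh overwrites it, so the refresh period is an upper bound on how long any tear stays on screen. A quick back-of-the-envelope sketch (my own arithmetic, not anyone's measurements):

```python
# A tear artifact persists at most one refresh period before the next
# scanout replaces it, so higher refresh rates shorten its lifetime.
for hz in (60, 144, 240, 480):
    refresh_period_ms = 1000.0 / hz
    print(f"{hz:>3} Hz: a tear line lasts at most {refresh_period_ms:5.2f} ms")
```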
 
But why?

Most people can't even tell the difference between a 240 Hz and a 144 Hz panel. They have to be right next to each other to see a difference, and even then it's not a major upgrade. It's nothing like going from a 60 Hz to a 144 Hz panel.

Definitely suffering from diminishing returns at this point, but I do remember the LTT video comparing 144 Hz vs 240 Hz: apparently their best FPS player was making more shots on the 240 Hz panel because there were more frames and less scene change between redraws than at 144 Hz, and it improved his play by a noticeable amount. It was far less noticeable as a conscious impression of smoothness than in how his almost automatic reflexes could benefit from it.
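
For a sense of the margins involved, here is the simple frame-time arithmetic behind that (my own numbers, not LTT's data): the screen also shows, on average, an image that is about half a refresh period old.

```python
# Refresh interval and average "age" of the on-screen image at each rate.
for hz in (144, 240):
    frame_ms = 1000.0 / hz
    print(f"{hz} Hz: {frame_ms:.2f} ms per refresh, "
          f"~{frame_ms / 2:.2f} ms average image age")
# 6.94 ms vs 4.17 ms per refresh: a ~1.4 ms average advantage for 240 Hz.
```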
 
Interesting. I have a GTX 1080 Ti along with an HP Omen 32" (2560x1440) and use Nvidia Fast Sync. I never experience tearing at all, but I'm only at 60 Hz. It boggles my mind that peeps using 240 Hz monitors with V-Sync off have better "skillz" due to their higher refresh rate. Seems to me that for competitive online play they are relying on a technological advantage, not skill. But who am I to say, since they are the ones who usually hand me my ass on a silver platter. Right now I am choosing not to afford a faster refresh rate, since the monitor I want hasn't been made yet. 480 Hz is a bit excessive, no?
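
For what it's worth, that matches the usual description of Fast Sync: the game renders unthrottled into spare buffers, and at each vblank the display flips to the most recently completed frame, so no refresh ever mixes two frames. A toy sketch of that buffer policy (an illustration of the idea, not Nvidia's actual implementation):

```python
# Toy Fast-Sync-style policy: render unthrottled, but only flip whole,
# completed frames at vblank, so no refresh shows a mix of two frames.
RENDER_FPS, REFRESH_HZ, DURATION_S = 200, 60, 0.05

finish_times = [i / RENDER_FPS for i in range(int(DURATION_S * RENDER_FPS))]
for v in range(int(DURATION_S * REFRESH_HZ)):
    vblank = v / REFRESH_HZ
    done = [i for i, t in enumerate(finish_times) if t <= vblank]
    print(f"vblank {v}: scan out frame {done[-1]}")  # newest complete frame wins
```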
 
Every time we upgrade technology, a bunch of people come in and whine about why we don't need it. I remember when we had arguments just like this and there were people saying humans couldn't do anything with more than 30 Hz... And the console makers thank you for supporting that so they could keep turning out junk. Probably the same people who believed the industry when it said 60 Hz was fine on LCDs because they didn't need high refresh rates.

Let's look at some other angles. Let's just say we aren't sure whether this is important; how about maybe it doesn't matter? At some point the technology comes so far down in price that the extra performance is a negligible increase in cost, and that is a good thing to push for. And sometimes, on top of that, the push forward drives new technology. For instance, ultra-high-DPI displays like those found on phones are what keep VR from looking really bad, and they also let us start getting rid of AA in games. These ultra-high-refresh monitors might enable things like Nvidia's LightBoost where it is not supported, or something like G-Sync.

Another issue is that some people are probably just able to make better use of the technology. Maybe a pro really can feel the difference while a soccer mom or an older person cannot.

The way I feel is simple: push it as far as you can and see where it goes. If you don't believe in it, don't buy it. And eventually we will see whether it sweeps the market, either because people can feel a difference or because the price comes down far enough that people just say, why not.

Interesting. I have a GTX 1080 Ti along with an HP Omen 32" (2560x1440) and use Nvidia Fast Sync. I never experience tearing at all, but I'm only at 60 Hz. It boggles my mind that peeps using 240 Hz monitors with V-Sync off have better "skillz" due to their higher refresh rate. Seems to me that for competitive online play they are relying on a technological advantage, not skill. But who am I to say, since they are the ones who usually hand me my ass on a silver platter. Right now I am choosing not to afford a faster refresh rate, since the monitor I want hasn't been made yet. 480 Hz is a bit excessive, no?

There is a subtle difference in wording, but it is skill; it just requires technology to unleash it, and that is a very important distinction. The player still needs to aim and still needs to perform all the same actions; they are just better able to use their senses and react more quickly. Forcing everyone to play at less-than-optimal frame rates doesn't make anyone more skilled; it just places an arbitrary handicap on everyone, reducing their skills or forcing them to compensate with different ones. I think it's fine: eventually we will get to a point where the playing field is leveled because everyone has the technology, and that is what we are really shooting for.

Take integrated graphics: it is now possible to play some less demanding games on laptops with integrated graphics at 120 or 240 fps if you turn the settings down. This is a great thing; it opens up the world of gaming to a lot more people, and at some point the 480 Hz display may be available on many laptops. Also, as the technology improves you get diminishing returns. If you played games like HLDM, Quake, or Unreal (or games before that), you know how painful it was to be on integrated graphics versus people with a discrete GPU. Same with differences in latency. Nowadays the gap between the average and the best is not quite as wide, and certainly far more palatable; this is great progress. There was a time when no amount of turning the graphics down would get you 85 fps in many games on an average system.
 
Quoting Cirthix, the [H] member behind this project:

4K @ 120 Hz
1080p @ 240 Hz
720p @ 300 Hz
540p @ 480 Hz
fast PWM, PWM-free, scanning, and strobing modes
28" and 39" behave identically
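
A quick sanity check on those modes (my own arithmetic, not Cirthix's figures): three of the four work out to exactly the same line scan rate, which suggests the lower-resolution modes are trading vertical resolution for refresh under a roughly fixed scan budget.

```python
# Lines per second for each advertised mode: vertical resolution * refresh.
for lines, hz in [(2160, 120), (1080, 240), (720, 300), (540, 480)]:
    print(f"{lines:>4}p @ {hz:>3} Hz -> {lines * hz:,} lines/s")
# 2160p@120, 1080p@240, and 540p@480 all land on 259,200 lines/s;
# 720p@300 sits a bit below that apparent ceiling.
```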

The LTT review will be posted on the 17th. Blur Busters is ramping up testing and will have some content up before the 17th and some after. I'll be running the preorder from the 17th to the 7th, with 5 weeks of manufacturing afterwards and 2 weeks of flashing, assembly, and testing; depending on order numbers, units should be appearing in customers' hands at the end of October. Both reviews will have preorder links.

The latest FW brings 10-bit support and eliminates scanlines completely (LTT mentions lines, but their FW is a bit older).
 
Are there any blind studies of 120 Hz vs 240 Hz? Can people really tell the difference, and at what percentages? In the old CRT days I could tell between 80 and 100, but 120 was so insignificant it was pointless - well, I had to use a lower resolution to get the higher refresh rates. In VR I sure can tell between 90 and 45 fps.

I would like to see a 144 Hz monitor - three of them, one set at 100 Hz, another at 120 Hz, and the third at 144 Hz, with either G-Sync or FreeSync - and let a number of well-seasoned gamers determine which monitor is which. The game would have to hold each refresh rate at the same quality/resolution. How many, if any, would guess right compared to not? Unless something like this is done, ever-higher refresh rate monitors may be a wasted effort. Have an incentive where those who get them all right enter a draw to take home a monitor.
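
If anyone ever runs that test, the scoring is straightforward: with three monitors there are 3! = 6 possible labelings, so a blind guesser gets the whole assignment right 1/6 of the time, and a binomial tail tells you whether the group beat chance. A minimal sketch (the participant counts are made up for illustration):

```python
from math import comb

def p_at_least(k: int, n: int, p: float) -> float:
    """Probability of k or more successes in n independent trials."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

chance = 1 / 6  # three monitors -> 3! = 6 orderings, only one is correct

# Hypothetical outcome: 9 of 20 gamers label all three monitors correctly.
print(f"P(>= 9 of 20 by luck) = {p_at_least(9, 20, chance):.4f}")
```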
 
Are there any blind studies of 120 Hz vs 240 Hz? Can people really tell the difference, and at what percentages? In the old CRT days I could tell between 80 and 100, but 120 was so insignificant it was pointless - well, I had to use a lower resolution to get the higher refresh rates. In VR I sure can tell between 90 and 45 fps.

I would like to see a 144 Hz monitor - three of them, one set at 100 Hz, another at 120 Hz, and the third at 144 Hz, with either G-Sync or FreeSync - and let a number of well-seasoned gamers determine which monitor is which. The game would have to hold each refresh rate at the same quality/resolution. How many, if any, would guess right compared to not? Unless something like this is done, ever-higher refresh rate monitors may be a wasted effort. Have an incentive where those who get them all right enter a draw to take home a monitor.
I did my testing back then too and found that once you hit 75 Hz, the benefit of going higher starts to diminish rather quickly (eye-fatigue-wise, though, 100 Hz was absolutely the minimum on a CRT). If I'd had a 120 Hz screen I'd have run it at 100 Hz (because of PAL video content) and saved some energy.
 
I did my testing back then too and found that once you hit 75 Hz, the benefit of going higher starts to diminish rather quickly (eye-fatigue-wise, though, 100 Hz was absolutely the minimum on a CRT). If I'd had a 120 Hz screen I'd have run it at 100 Hz (because of PAL video content) and saved some energy.
That was pretty much my conclusion back then; I just kept it around 85 Hz (60 Hz on a CRT was terrible). I think now, with FreeSync and G-Sync, this could really throw a monkey wrench into the mix. I just don't see any real data or convincing tests on this so far. If people for the most part can't tell the difference between 100 Hz and 144 Hz, then anything above that may be pointless except for the truly gifted. Then again, maybe people would be able to tell the difference after some form of showing them what to look for - training - and then redoing the test to see if they get some benefit from the faster refresh rates (FreeSync or G-Sync).
 
Are there any blind studies of 120 Hz vs 240 Hz? Can people really tell the difference?

Humans can definitely perceive the difference between scanning/strobe modes and non-LMB (low motion blur) modes. The advantages of higher refresh rates are not nearly as relevant as LMB vs. non-LMB.
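
The arithmetic behind that: while your eye tracks a moving object, perceived blur is roughly tracking speed times pixel persistence, which is why a short strobe cuts blur far more than a modest refresh bump does. A sketch using that commonly cited approximation (the speed is an arbitrary example):

```python
# Perceived motion blur while eye-tracking, approximately:
#   blur_px ~= tracking_speed_px_per_s * persistence_s
SPEED = 1000  # px/s, an arbitrary but realistic tracking speed

for name, persistence_s in [
    ("60 Hz sample-and-hold", 1 / 60),
    ("240 Hz sample-and-hold", 1 / 240),
    ("120 Hz with a 1 ms strobe", 0.001),
]:
    print(f"{name:>26}: ~{SPEED * persistence_s:.1f} px of blur")
```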
 
Hey, if it looks smoother, why not bring back the Blur Busters guy who used to post here?
 
Humans can definitely perceive the difference between scanning/strobe modes and non-LMB (low motion blur) modes. The advantages of higher refresh rates are not nearly as relevant as LMB vs. non-LMB.
I know fighter pilots were able to respond effectively to flashed images at up to around 240 Hz, if I remember correctly. That ability is probably present, but for complex scenes and games, how much of a real impact does it make? Just some unanswered questions that it would be very interesting to settle, rather than endless YouTube videos talking about the same junk over and over again. Wish someone would push the envelope and dig in.
 
Still at 60 Hz here and I'm fine. As for Quake 3, it's just fine at 60 Hz with 1000 fps. This 144 and 240 Hz stuff is just a gimmick to me; the human eye can't really tell 60, 144, or 240 Hz apart.
 
Meanwhile, game designers add non-optional motion blur effects to their games.
 
Still at 60 Hz here and I'm fine. As for Quake 3, it's just fine at 60 Hz with 1000 fps. This 144 and 240 Hz stuff is just a gimmick to me; the human eye can't really tell 60, 144, or 240 Hz apart.
'Member when 24 fps was all we needed?

I wish that instead of pushing 4K and 8K, movies would actually go from 24 to 48, or better yet 96 fps. 4K is meaningless if the picture is a blurry mess in every scene with motion.
 
'Member when 24 fps was all we needed?

I wish that instead of pushing 4K and 8K, movies would actually go from 24 to 48, or better yet 96 fps. 4K is meaningless if the picture is a blurry mess in every scene with motion.

They've tried pushing the frame rate higher and people despise it. It just looks... wrong...

I would imagine that if it became the norm people would adjust, but it'll be an uphill battle.
 
That was pretty much my conclusion back then; I just kept it around 85 Hz (60 Hz on a CRT was terrible). I think now, with FreeSync and G-Sync, this could really throw a monkey wrench into the mix. I just don't see any real data or convincing tests on this so far. If people for the most part can't tell the difference between 100 Hz and 144 Hz, then anything above that may be pointless except for the truly gifted. Then again, maybe people would be able to tell the difference after some form of showing them what to look for - training - and then redoing the test to see if they get some benefit from the faster refresh rates (FreeSync or G-Sync).
Here's the thing... if something is good enough for me, then why bother with something that only gives diminishing returns?
 
They've tried pushing the frame rate higher and people despise it. It just looks... wrong...

I would imagine that if it became the norm people would adjust, but it'll be an uphill battle.
It only looks wrong if it's done wrong. Meanwhile, most sports broadcasts are already 50 fps (and have been for ages); people just don't know any better. It's 50i, but when deinterlaced properly it is, for all intents and purposes, 50 fps.
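
To make the 50i point concrete: each interlaced field is already a distinct moment in time, so even a crude bob deinterlacer turns 50 fields/s into 50 full frames/s by filling in each field's missing lines. A minimal sketch (line-doubling only; real deinterlacers interpolate much more cleverly):

```python
import numpy as np

def bob_deinterlace(fields):
    """Line-doubling 'bob': each 288-line field of 576i video becomes a
    full 576-line frame, so 50 fields/s in -> 50 distinct frames/s out."""
    frames = []
    for i, field in enumerate(fields):
        h, w = field.shape
        frame = np.empty((2 * h, w), dtype=field.dtype)
        frame[i % 2::2] = field        # real lines go on this field's rows
        frame[(i + 1) % 2::2] = field  # missing rows filled by doubling
        frames.append(frame)
    return frames

fields = [np.full((288, 720), v, dtype=np.uint8) for v in (10, 20, 30, 40)]
print([f.shape for f in bob_deinterlace(fields)])  # four 576x720 frames
```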

I assume that when they tried 50 fps in movies they used the same old lenses with the same shutter speeds they used for 24 fps. If you have twice the frame rate you need twice the shutter speed as well to achieve visually similar results.
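
The standard 180-degree shutter rule makes that concrete: exposure is half the frame interval, so doubling the frame rate halves both the per-frame exposure and the per-frame blur. Quick arithmetic (the pan speed is an arbitrary example):

```python
# 180-degree shutter rule: exposure = 1 / (2 * fps).
# Per-frame blur during a pan is roughly pan_speed * exposure.
PAN_SPEED = 2000  # px/s across the frame, arbitrary example value

for fps in (24, 48, 96):
    exposure_s = 1 / (2 * fps)
    print(f"{fps} fps: 1/{2 * fps} s exposure, "
          f"~{PAN_SPEED * exposure_s:.0f} px of blur per frame")
```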

Also, many TV sets have built-in picture interpolation to add intermediate frames, and I didn't hear a backlash about that either.

Of course, our brain is pretty good at filling in the blanks when there is motion blur. I've tried it many times on myself: when I look at a recording in motion it seems to have detail, but as soon as I pause it and go through it frame by frame, it's only a blurry mess. Our brain has some very good temporal interpolation, it seems. But I'd rather have real information, not something left to my imagination. The more your brain has to fill in the blanks, the more exhausting the viewing is.
 
I agree with Rogue. Unless this has some specific use in some specific field, I don't see the point.
VR, baby: each eye gets 240 Hz off a single screen. Gotta make the big stuff before miniaturization.
 
You need two 240 Hz screens for that, not one 480 Hz screen.
Not if you're rocking el-cheapo VR like the Samsung Gear or PlayStation VR, although yeah, the Vive and Rift use two displays right next to each other.
 
Not if you're rocking el-cheapo VR like the Samsung Gear or PlayStation VR, although yeah, the Vive and Rift use two displays right next to each other.
Even having two 120 Hz screens would be better than any single screen.
 
The panel is 4K 120 Hz native, with higher refresh rates at lower resolutions. IMO, 120 Hz 4K is a lot more interesting than ridiculously high refresh rates at 720p. I want to see what it can do at 1080p.
 
I mean, I had never imagined what 144 Hz could look like until I upgraded, and it was like night and day... I can't even imagine 240 Hz or beyond, so if there's a purpose, I'd be down with it.
 