Samsung's first Odyssey gaming monitors include a 240Hz ultra-wide

Of course it's for sitting close: 1 meter. Better adjust your in-game FOV, though.

 
Of course it's for sitting close: 1 meter. Better adjust your in-game FOV, though.


Are curved monitors going to become the majority eventually? Is it inevitable? I really don't like the aesthetics, or the apparent distortion of the graphics, from the UI to everything else.
 
Are curved monitors going to become the majority eventually? Is it inevitable? I really don't like the aesthetics, or the apparent distortion of the graphics, from the UI to everything else.

In TVs, where you have multiple viewers, reports are that they are declining, but since computer monitors are for single viewers, I think there is a market for both to continue. I would never consider a curved TV, but I am intrigued by curved monitors. My Dell 2408 is still doing great, so I've never considered replacing it.
 
Are curved monitors going to become the majority eventually? Is it inevitable? I really don't like the aesthetics, or the apparent distortion of the graphics, from the UI to everything else.
A 3400R is indistinguishable from a flat panel even sitting 5-7 feet away. There's nothing wrong with a subtle curve.

This new stuff from Samsung, however, is pushing the curve a bit too hard. Hopefully enough backlash and they'll get the point.
 
Are you actually a fan of these extreme curved monitors? They seem horrible to me, and this one is too low a resolution.

"If that strikes you as overkill, there will be less extravagant options. The G7 series (below) offers 27- and 32-inch panels at a more conventional 2,560 x 1,440 resolution with the same 240Hz refresh rate, 1ms response time, 1000R curvature, FreeSync and G-Sync. It's not as bright with 'just' DisplayHDR 600 support, but we suspect many gamers won't mind. The range will also be ready in the second quarter."


https://www.engadget.com/2020/01/03/samsung-odyssey-gaming-monitors/

Almost every super-ultrawide monitor gets this comment:

"Interesting monitor, would be great if it was taller, less wide and 4K" (ie: make it like every other monitor)

1. The extra FOV is on purpose.
2. The curve is needed because VA has semi-poor viewing angles.

It will probably be $1,500-1,800 and will be a good candidate to replace 3x 27" monitors.
 
Are curved monitors going to become the majority eventually? Is it inevitable? I really don't like the aesthetics, or the apparent distortion of the graphics, from the UI to everything else.

I've had a curved Asus ultrawide 21:9 34" screen for 3 or 4 years now. It's fine. It's not as sharp a curve, I sit about 3 feet from it, and the curve is fine. Curved screens in themselves are generally better, IMHO. Obviously too much curve is too much, but if you sit at the correct spot for this LCD, it will appear even wider than it is; by that I mean it will cover a larger field of view than a typical screen.

As someone else mentioned, yes, you will want to raise the in-game FOV, probably to 120. It should be the number of degrees of the circle formed by the display area when you are sitting at the sweet spot; 120 is 1/3 of a complete circle. And looking more closely at the linked article, the sharply curved screen is a 49". Honestly, the curvature is probably about right, at least for the 49" LCD.

Turns out the curvature rating is the radius of the curve in millimeters, which is also the recommended maximum viewing distance. So that means sitting 1 meter away from the sharply curved 49", or about 3 feet. That's a pretty typical viewing distance and shouldn't be uncomfortable.
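Quick sanity check on that "degrees of the circle" idea, since the curvature number is a radius: the angle the screen covers from the sweet spot is just arc width divided by radius. A minimal sketch, assuming you sit exactly at the curvature center and that a 49" 32:9 panel is roughly 1200 mm wide along the curve (my estimate, not a spec):

```python
import math

def subtended_angle_deg(arc_width_mm: float, radius_mm: float) -> float:
    """Horizontal angle the screen covers when you sit at the center of its curve."""
    return math.degrees(arc_width_mm / radius_mm)

# Assumed numbers: ~1200 mm of panel arc on a 1000R screen, viewed from 1000 mm away.
print(subtended_angle_deg(1200, 1000))  # ~68.8 degrees
```

Plug in your own measurements if you want a FOV number to aim for in-game.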

Be cool to try it.

Only sucky thing that comes to mind is that most games' max FOV tops out at 105 or 110 degrees.

Here's a good read on curved screens: https://www.viewsonic.com/library/entertainment/monitor-curvature-explained
 
You could maybe put lower-FOV games in windowed mode with black bars if you care to match your real FOV... a sucky workaround, but at this distance it could be a more immersive experience.
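For anyone curious what that black-bars workaround would actually look like, here's a rough sketch of the geometry, again assuming you sit at the curvature center; the function and all the numbers below are my own assumptions, not specs:

```python
import math

def window_arc_mm(game_hfov_deg: float, radius_mm: float, panel_arc_mm: float) -> float:
    """Arc width a game window should span so the game's rendered horizontal FOV
    matches the real-world angle that window covers from the sweet spot."""
    ideal = radius_mm * math.radians(game_hfov_deg)
    return min(ideal, panel_arc_mm)  # the window can't be wider than the panel

# Example: a game capped at a 60-degree horizontal FOV (just an example value)
# on a 1000R screen with ~1200 mm of panel arc would want a window ~1047 mm wide.
print(window_arc_mm(60, 1000, 1200))
```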
 
My money is still on the placebo effect, as there are just too many parallels to the audiophile world, but let's just agree to disagree on this one.

I don't want to derail yet another thread :p

I bet Kyle and the other moderators are getting tired of cleaning up threads after they go down the rabbit hole and are no longer discussing the topic at hand.

Seriously, dude? It's all placebo effect? You feel strongly that the limits of the human eye just happen to match up with the shitty LCD hardware limitations we had for a decade+? Why the hell would you randomly draw the line at 60Hz when most people notice a positive difference immediately when using a higher refresh monitor?
 
If you can't tell the difference between 60Hz and 120Hz, either something is wrong with your PC or your eyes.
No


I can’t tell the difference between 75hz freesync and 144hz freesync in actual gameplay.

Someone told me to try moving my mouse around in tiny fast circles on the 2D desktop to tell the difference between 75Hz and 144Hz, and yes, I could tell the difference there — but let's pause and take a moment to realize how absurd that differentiation method is.

I have 20/20 vision in one eye and a step better in the other — tested just this year. I'm not mentally deficient and my PC works just fine.
 
I see a lot of people complaining about the curve, but if the point is to make it feel real, even at 1800R most people are never going to sit close enough for the curve to be too strong. It probably just seems awkward because you're not used to it. According to the specs, you'd need to sit about 1.8 meters away or more for it to become a problem, and that's pretty far for a desk monitor.
 
Seriously, dude? It's all placebo effect? You feel strongly that the limits of the human eye just happen to match up with the shitty LCD hardware limitations we had for a decade+? Why the hell would you randomly draw the line at 60Hz when most people notice a positive difference immediately when using a higher refresh monitor?
I think 60Hz was the standard for ~30 years because it's a tolerable refresh rate for most eyes. It's not like the standard has always been 60Hz since the beginning of displays; it's more that that's the level of performance where most complaints stopped, so there was no driving force to push higher.
 
I think 60Hz was the standard for ~30 years because it's a tolerable refresh rate for most eyes. It's not like the standard has always been 60Hz since the beginning of displays; it's more that that's the level of performance where most complaints stopped, so there was no driving force to push higher.

Exactly.

There was a driving force to improve on CRTs, as 60Hz caused some nasty flicker, but once we moved to LCDs that flicker went away.
 
No


I can’t tell the difference between 75hz freesync and 144hz freesync in actual gameplay.

Someone told me to try moving my mouse around in tiny fast circles on the 2D desktop to tell the difference between 75Hz and 144Hz, and yes, I could tell the difference there — but let's pause and take a moment to realize how absurd that differentiation method is.

I have 20/20 vision in one eye and a step better in the other — tested just this year. I'm not mentally deficient and my PC works just fine.

Exactly.

I can tell the difference in tests that are designed to illustrate the difference. That flying UFO test is a prime example. The difference is immediately noticeable in that.

In-game, on the other hand, the difference is subtle as all hell. I'm not saying there isn't one, but I am questioning whether the unremarkable difference is worth the money, heat, and noise.
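One way to put numbers on why the UFO-style motion tests make it so obvious while gameplay feels subtle: what changes with refresh rate is how big a jump a moving object makes every frame. A quick sketch, with the 960 px/s scroll speed purely as an example value:

```python
# Per-frame motion step at a fixed scroll speed for several refresh rates.
speed_px_per_s = 960  # example speed, not tied to any particular test

for hz in (60, 75, 120, 144, 240):
    frame_ms = 1000 / hz            # how long each frame stays on screen
    step_px = speed_px_per_s / hz   # how far the object jumps between frames
    print(f"{hz:>3} Hz: {frame_ms:5.1f} ms/frame, {step_px:5.1f} px per step")
```

At 60Hz that's a 16-pixel jump every frame versus about 7 pixels at 144Hz, which is easy to see when you're staring at a moving UFO and much easier to miss mid-firefight.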
 
For me the difference between, say, 60Hz and 120Hz was indeed subtle. I play mainly FPS shooters in multiplayer, and now that I've had 144Hz for a few months, when an update reset the frame limiter back to 60 without me realising, it felt like I was running in treacle during the game, and I will never go back to a lower rate. Of course, YMMV from person to person, so I'm glad we have the choice. Can't really say that I noticed any more heat or noise from running a higher-Hz monitor, though...
 
I'm curious about it - I would have to have it on my desk to check it out (I'm currently using a 38" Dell curved monitor and it works great - larger with a bigger curve? Not sure).

And lol about the Hz stuff again. The higher the better. Most CRTs back in the day (mid-to-late '90s) could do at least 60Hz; 70-90Hz was pretty common even with cheapo monitors, and sometimes you could push the Hz even higher if you were willing to drop the screen resolution. 100Hz was around quite a bit (I think one of the last CRT monitors did 100Hz native but could go to 120Hz if you went to a lower res). When the new LCDs came out it was a massive step backwards in quality (just significantly less weight), and it took a long time for LCDs to start to approach the quality of most CRT monitors of the time. Heck, for well over a decade people would go out of their way to get Sony Trinitron CRT monitors that could do something like 160Hz or more (can't remember now), but man, they were the dream monitor if you could afford it.
 
CRT for some is still king


...just a very overweight king
 
CRT for some is still king


...just a very overweight king



Yeah, no thanks.

CRT screens are small, usually lacking in the sharpness of a good panel, and I don't exactly miss the good old CRT tan resulting from sitting in front of one and getting bombarded with stray electrons for hours.

Let's leave CRTs where they belong, as a relic of the '90s and earlier.
 
CRT for some is still king


...just a very overweight king


So recording video of a CRT to play back on my LCD is supposed to show me the advantages of the CRT? That's like video ads for HDR trying to sell you on it, except you can't see it because you aren't watching on an HDR screen.
 
Could be nostalgia. Like how folks think 24p film looks better than 48p. The lower detail hides some of that artificial stuff from your view. We may be looking back on flat monitors with a similar fondness for how the corners of the screen aren't facing us, so they don't look quite as good, like a bit of vignetting on old photos... really just a style thing.
 
Yeah, no thanks.

CRT screens are small, usually lacking in the sharpness of a good panel, and I don't exactly miss the good old CRT tan resulting from sitting in front of one and getting bombarded with stray electrons for hours.

Let's leave CRTs where they belong, as a relic of the '90s and earlier.

Yeah, since my first computer was a C64, I've had decades of CRT computer use, plus significant overlapping use of high-quality CRTs and LCDs during the early LCD days, both at home and at work.

So I just don't get the recent Pollyanna view of CRTs.

CRTs were a bit sharper at non-native resolutions, because there was no such thing as a native resolution, but they were much blurrier than an LCD at its native resolution, with electron beams bent by magnets lighting phosphors in a general area.

It was a shocking and wonderful transition to LCD sharpness and perfect geometry.

That's not to say LCDs are or were without foibles. I actually joined this forum to talk displays back in 2006, after buying and being disappointed with my first LCD purchase: a Dell 24" PVA screen that had about 48 ms of input lag and a response time not much better, leaving big, obvious ghosting trails. I got rid of it immediately.

But today LCD benefits remain (ultra sharpness and perfect geometry), while their defects have been tempered significantly.

I can't see anything but misplaced nostalgia touting CRTs. Good fucking riddance!
 
Could be nostalgia. Like how folks think 24p film looks better than 48p. The lower detail hides some of that artificial stuff from your view. We may be looking back on flat monitors with a similar fondness for how the corners of the screen aren't facing us, so they don't look quite as good, like a bit of vignetting on old photos... really just a style thing.

I had a friend argue with me that 48p movies shouldn't be a thing exactly because they reveal just how fake everything is. I couldn't believe my ears. Let's kill the tech instead of improving the production. Makes total sense.

It was a very passionate conversation
 
I had a friend argue with me that 48p movies shouldn't be a thing exactly because they reveal just how fake everything is. I couldn't believe my ears. Let's kill the tech instead of improving the production. Makes total sense.

It was a very passionate conversation

He's right, though. You're arguing that a haiku with more syllables is better because it has more information. It's exactly the same thing. It's the same reason movies with experimental narratives don't take off mainstream. Movies have rules and a look; otherwise it's just video.
 
He's right, though. You're arguing that a haiku with more syllables is better because it has more information. It's exactly the same thing. It's the same reason movies with experimental narratives don't take off mainstream. Movies have rules and a look; otherwise it's just video.

Not really. It would only reveal more if movies were in constant motion, never stopping. But in reality much of the shooting is static, and if there is anything to reveal, it will be seen equally then.

All a higher frame rate does is reduce judder/strobing and improve motion clarity.

Being attached to low-frame-rate judder/strobing/blur is a nostalgic fixation. Nothing more.
 
He's right, though. You're arguing that a haiku with more syllables is better because it has more information. It's exactly the same thing. It's the same reason movies with experimental narratives don't take off mainstream. Movies have rules and a look; otherwise it's just video.

Oh, panning at 24fps is AMAZING. It's so smooth, Mandelbrot would be proud
 
I had a friend argue with me that 48p movies shouldn't be a thing exactly because they reveal just how fake everything is. I couldn't believe my ears. Let's kill the tech instead of improving the production. Makes total sense.

It was a very passionate conversation

I kind of feel like higher framerates in film make everything look like a daytime soap opera, or a home movie shot on a camcorder.

24hz is part of what gives movies their movie magic.

This is probably just because our minds have been biased by 30 to 60hz daytime TV being bad, thus fluid motion in film is associated with low quality content.

Future generations that grew up with more high refresh content won't have these same biases.
 
I kind of feel like higher framerates in film make everything look like a daytime soap opera, or a home movie shot on a camcorder.

24hz is part of what gives movies their movie magic.

This is probably just because our minds have been biased by 30 to 60hz daytime TV being bad, thus fluid motion in film is associated with low quality content.

Future generations that grew up with more high refresh content won't have these same biases.

I think you're right. It doesn't "look bad"; it just feels weird to me. I always turn off frame smoothing, or whatever each TV calls it, when I get them.
 
You guys watch too much TV. I stopped doing that 20 years ago.
 
It's more the shutter speed than the frame rate. A ~1/48th-of-a-second exposure gives that "smooth" look we associate with 24p film, but it would look about the same at 48p or ANYp, if it could be filmed at around a 1/48th shutter.
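To put the shutter-speed point another way: the blur baked into each frame is exposure time multiplied by how fast things move, independent of how many frames you capture per second. A tiny sketch with made-up numbers:

```python
def blur_px(object_speed_px_per_s: float, shutter_s: float) -> float:
    """How far (in pixels) an object smears during one exposure."""
    return object_speed_px_per_s * shutter_s

# 24p with a 180-degree shutter exposes each frame for 1/48 s.
print(blur_px(960, 1 / 48))  # 20 px of smear per frame
# 48p can keep that same 1/48 s exposure (a 360-degree shutter), so each frame
# carries the same smear; only the number of steps per second changes.
print(blur_px(960, 1 / 48))  # still 20 px
```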
 
Oh, panning at 24fps is AMAZING. It's so smooth, Mandelbrot would be proud

And that's part of the language of cinema. It's what audiences are used to. Trade great pans (not exactly the most common shot) for what looks to us like terrible soap opera for the other 95% of the movie? It is what it is; movies don't follow the same rules as video card benchmarks.
 
And that's part of the language of cinema. It's what audiences are used to. Trade great pans (not exactly the most common shot) for what looks to us like terrible soap opera for the other 95% of the movie? It is what it is; movies don't follow the same rules as video card benchmarks.

I guess we'll see come the next wave of Avatar movies (I'd hope).
 
Not really. It would only reveal more if movies were in constant motion, never stopping. But in reality much of the shooting is static, and if there is anything to reveal, it will be seen equally then.

All a higher frame rate does is reduce judder/strobing and improve motion clarity.

Being attached to low-frame-rate judder/strobing/blur is a nostalgic fixation. Nothing more.

No, I'm not a cinephile at all; it's just, well, not enough time today, and this:

https://en.wikipedia.org/wiki/Dunning–Kruger_effect
 
I apologize in advance for bringing this thread back on topic.

I saw the screen. I liked the image based on the highly controlled demo, and I liked the curve a lot in the context of gaming. I think the aspect ratio would be too wide to work well with most of my work.

The real reason this screen is hyper-wide is that making a curved screen is easy but making a spherical one isn't. If this were 16:9, the cylindrical shape would give you a headache from the distortion.
 
24hz is part of what gives movies their movie magic.

No...

This is probably just because our minds have been biased by 30 to 60hz daytime TV being bad, thus fluid motion in film is associated with low quality content.

Again, no.

Film is 24Hz because, when that became a "standard" in the 1920s or so, film was VERY expensive. The output for the (mostly slow-ish moving) films was fine for the time. They weren't making action movies back then that compare to the action movies of today (if at all).

Your brain/eyes see at 30fps. CRTs had to do 60fps because at 30fps, if the moment the screen is drawing is off by half a cycle from when your brain processes the visual data, it appears to be strobing or flickering. This can be nausea-inducing. At 60fps it's, for the most part, fast enough either way that persistence of vision (images do not immediately fade from what your brain sees), plus the phosphor in the CRT screen continuing to glow for a bit of time after the electron beam passes, makes the motion appear smooth.

When you watch a 24fps movie on a cable channel, it displays at 30fps and just repeats every 4th frame. It's often obvious if there is a bit of motion going on, and it's annoying to no end.
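Roughly, that repeat looks like this (a simple frame-doubling sketch, ignoring the interlaced-field version of 3:2 pulldown): 24 source frames have to fill 30 display slots, so one of every four film frames gets held for an extra slot, which is exactly the uneven cadence you notice on motion.

```python
def repeat_pattern(film_frames):
    """Map 24 fps film frames onto 30 fps display slots by doubling every 4th frame."""
    shown = []
    for i, frame in enumerate(film_frames):
        shown.append(frame)
        if i % 4 == 3:          # every 4th film frame is held for one extra slot
            shown.append(frame)
    return shown

print(repeat_pattern(list("ABCDEFGH")))
# ['A', 'B', 'C', 'D', 'D', 'E', 'F', 'G', 'H', 'H']  -> 8 film frames fill 10 slots
```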

Movies would be superior at 60fps; action would be much better. They would have to be filmed that way, though.

The Hobbit was filmed at 48fps; here is a quote from someone who saw it in the theater:
SomeHacker44 said:
I saw that movie at 48 FPS. It's the only 48 FPS movie I saw. However, it was stunning how much more - I don't know, realistic isn't quite the right word, but something - it felt to watch it. I wanted to watch all movies in (preferably non-3D) 48 FPS from then on, but it never caught on that I could tell and I haven't seen another one.
 
Your brain/eyes see at 30fps. CRTs had to do 60fps because at 30fps, if the moment the screen is drawing is off by half a cycle from when your brain processes the visual data, it appears to be strobing or flickering.

Not to mention that CRTs draw in interlaced mode, so only half the screen is drawn at a time (every other line from the top, and then every other skipped line from the top). That would mean that if a CRT were at a 30 Hz refresh rate, each full screen would only be displayed at 15 Hz. Using your hypothetical 30 FPS example, playing at 30 FPS on a 30 Hz refresh rate CRT would mean you'd only see HALF the FPS.
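For anyone who hasn't thought about interlacing in a while, a tiny sketch of the every-other-line idea (just an illustration, not tied to any particular CRT):

```python
# One "full frame" of 8 scanlines, split into the two alternating fields.
frame_lines = [f"line {n}" for n in range(8)]

even_field = frame_lines[0::2]  # first pass draws lines 0, 2, 4, 6
odd_field  = frame_lines[1::2]  # second pass draws lines 1, 3, 5, 7

print(even_field)
print(odd_field)
# A complete picture only exists after both passes, which is why a 60 Hz
# interlaced signal works out to full frames at 30 Hz.
```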
 
They can be interlaced or non-interlaced, but it's a good point. Assume interlaced (the older tech) and 60fps is just about mandatory. The phosphor glow's fade time helps that work, as the lines alternate being refreshed.
 
No...



Again, no.

Film is 24Hz because, when that became a "standard" in the 1920s or so, film was VERY expensive. The output for the (mostly slow-ish moving) films was fine for the time. They weren't making action movies back then that compare to the action movies of today (if at all).

I never argued that the "film feel" was why they did it in the first place. The original film framerate was obviously put in place due to technical and cost limitations at the time.

When you watch a 24fps movie on a cable channel, it displays at 30fps and just repeats every 4th frame. It's often obvious if there is a bit of motion going on, and it's annoying to no end.

I can't remember the last time I watched a movie on TV. Probably 25 years ago?


Movies would be superior at 60fps; action would be much better. They would have to be filmed that way, though.

The Hobbit was filmed at 48fps; here is a quote from someone who saw it in the theater:

Probably accurate from an objective perspective, but it still remains a fact that to many (most?) of us, when high-framerate content is used, movies look like daytime soap operas, and the effect is very undesirable.

As mentioned, it's probably just because us old farts have been conditioned to associate high framerate with low quality content, but today's YouTube generation likely won't be saddled with that same bias.

The question is whether those of us with the bias can untrain it, or whether it's permanently set. Probably depends on how old we are.
 