Samsung Odyssey: 49" 5120x1440, 240 Hz, HDR1000

https://news.samsung.com/us/samsung-odyssey-gaming-monitor-line-ces-2020/

https://displaysolutions.samsung.com/monitor/odyssey

[Image: Samsung Odyssey G9 press photo]


For 2020 Samsung is releasing an upgraded version of their 49" CRG9. They claim 1 ms response time and a 240 Hz refresh rate, with an aggressive 1000R curve (the CRG9 is 1800R for reference). The stand seems to be a color-swapped version of the CRG9 stand, and in terms of overall design the only update is the screen portion.

I have my doubts about whether the VA panel can keep up with 240 Hz. While Rtings.com gave the CRG9 very good response time marks, those were for a panel running at 120 Hz; doubling the refresh rate would require even better response times. Likewise, it's still a mystery how that 240 Hz is achieved. Ideally it supports DP 2.0 and HDMI 2.1, but the worst case is that it requires chroma subsampling, dual DP inputs, or crap like that.

Also, there's no indication of whether the HDR is any better. The CRG9's local dimming is pretty rubbish and makes for a subpar HDR experience because black levels are raised a ton.
 
240Hz at near-4K resolution... LOLS. :D

But yeah, the question of how this thing gets the bandwidth for 240Hz suggests DP 2.0/HDMI 2.1 might be incorporated.
 
240Hz at near-4K resolution... LOLS. :D

But yeah, the question of how this thing gets the bandwidth for 240Hz suggests DP 2.0/HDMI 2.1 might be incorporated.

Actually it's still about a million pixels fewer than 4K, which gives a decent performance advantage. You won't be running it at 240 fps in anything but older games with reduced details, though, so that 240 Hz is largely marketing, but it's nice to have the headroom. Nothing prevents you from running narrower resolutions either, for example in competitive shooters that often don't even support ultrawides. You could most likely do 2560x1440 at 240 fps.
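Rough pixel-count arithmetic behind that claim (just back-of-the-envelope numbers, not benchmarks):

```python
# Pixel counts for the resolutions mentioned above
resolutions = {
    "5120x1440 (32:9 DQHD)": 5120 * 1440,   # 7,372,800 px
    "3840x2160 (4K UHD)":    3840 * 2160,   # 8,294,400 px
    "2560x1440 (QHD)":       2560 * 1440,   # 3,686,400 px
}
uhd = resolutions["3840x2160 (4K UHD)"]
for name, px in resolutions.items():
    print(f"{name}: {px:>9,} px ({px / uhd:.0%} of 4K)")
# DQHD has ~89% of the pixels of 4K, and QHD is exactly half of DQHD,
# which is why dropping to 2560x1440 frees up a lot of GPU headroom.
```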
 
No hardware G-Sync, and 49" is not much smaller than 55". I'd get a 55" OLED instead; it has the same G-Sync compatibility but true 4K.
 
No hardware G-Sync, and 49" is not much smaller than 55". I'd get a 55" OLED instead; they have the same G-Sync compatibility and true 4K.

It will be a vastly different experience. Below: a 55" OLED vs a (flat) 49" 32:9. The OLED is physically about double the height. The lack of any curvature is going to mean it needs to be pushed further back, or looking at the sides of the screen becomes awkward.

[Image: size comparison of a 49" 32:9 display vs a 55" 16:9 display]


I get that OLED has a lot going for it, which is why I just swapped my living room TV for one, but as has been mentioned in many threads before, you need some precautions and positioning management to make it work as a desktop display. Can we please not turn this into yet another OLED vs. everything else thread?

TVs also lack things like a PbP mode for multiple simultaneous inputs, which is really annoying since they would be perfect for it.

I'm interested in seeing how the Samsung G9, with its very curved screen, looks in person. It should bring the sides even closer, making the display seem less wide. I don't feel it has any features that would make me swap my CRG9 for it, though.
 
Like anything else, I'd want to see real-world reviews before getting excited. I wish them luck of course, but the unit would have to prove itself.
 
I love my OLED. My biggest pet peeve is that they won't make a 43", and they likely won't have a curved one until Samsung jumps on board with OLED tech, which they may skip in favor of micro LED, who knows.

That being said, this is definitely the sexiest monitor I've ever seen.
 
That has got to be the best looking monitor I've ever seen. Love the super curve too. Doubt I could go back to LCD though!
 
I really don't like the aspect ratio; I would greatly prefer a "regular" ~2.38:1.

Still, this sweetie would have to be seriously flawed for me to not buy it. The combination of size and curvature seems awesome, way ahead of the competition.
 
240Hz at near-4K resolution... LOLS. :D

But yeah, the question of how this thing gets the bandwidth for 240Hz suggests DP 2.0/HDMI 2.1 might be incorporated.

One would hope; I can't find anything written by Samsung themselves. I found two sources that claim HDMI 2.0/DP 1.4:

https://appleinsider.com/articles/20/01/08/ces-2020-best-of-monitors
Links to Samsung's own site, which has no info on HDMI/DP.

https://en24.news/2020/01/samsung-l...ve-curved-49-inch-gaming-screen-in-1440p.html
Links to The Verge, but I can't find any mention of HDMI/DP there.
 
This is worth it over the current version for the tighter curve alone. Personally, as tempted as I still am, it's too big for my (built-in) desk, and I'm not switching rooms just because of one monitor.

They need to hurry up with a high-refresh 38" or a high-DPI 38". This is excellent, but both of those would make a bigger difference. I think my perfect, not-too-distant-future display would be a 38" with a 7680x3200 resolution and a 144 Hz refresh rate.
 
Has anyone found out yet how Samsung achieves 240 Hz at native resolution with 10-bit color, without any form of DSC? It exceeds what current DP 1.4 bandwidth can handle (Samsung already confirmed that the monitor runs DP 1.4 and HDMI 2.0).

Are they using some sort of dual DP 1.4 setup to meet the bandwidth requirements, or do you have to turn down the colors (chroma subsampling)?
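For context, a rough estimate of the bandwidth involved (active pixels only, ignoring blanking overhead, so the real figure is a bit higher):

```python
# Uncompressed bandwidth needed for 5120x1440 @ 240 Hz with 10-bit RGB
width, height, refresh, bits_per_pixel = 5120, 1440, 240, 30

required_gbps = width * height * refresh * bits_per_pixel / 1e9
print(f"Required: ~{required_gbps:.1f} Gbit/s")   # ~53.1 Gbit/s

# Approximate usable payload of the links being discussed (after line coding)
print("DP 1.4 (HBR3): ~25.9 Gbit/s")
print("HDMI 2.0:      ~14.4 Gbit/s")
# Roughly double what a single DP 1.4 link carries, hence the speculation
# about DSC, dual cables, or chroma subsampling.
```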
 
Has anyone found out yet how Samsung achieves 240 Hz at native resolution with 10-bit color, without any form of DSC? It exceeds what current DP 1.4 bandwidth can handle (Samsung already confirmed that the monitor runs DP 1.4 and HDMI 2.0).

Are they using some sort of dual DP 1.4 setup to meet the bandwidth requirements, or do you have to turn down the colors (chroma subsampling)?

We won't know until the specs and manual are out. Where did you see it confirmed that it is DP 1.4 and HDMI 2.0? If that is the case, then it has to be dual DP 1.4, which would most likely make FreeSync at 240 Hz impossible and would turn this display from a nice upgrade over the CRG9 into a very lame one.

My biggest beefs with the CRG9 are the crap HDR and the lack of a one-click/preset PbP toggle.
 
We won't know until the specs and manual are out. Where did you see it confirmed that it is DP 1.4 and HDMI 2.0? If that is the case, then it has to be dual DP 1.4, which would most likely make FreeSync at 240 Hz impossible and would turn this display from a nice upgrade over the CRG9 into a very lame one.

My biggest beefs with the CRG9 are the crap HDR and the lack of a one-click/preset PbP toggle.


It will need a radically different backlight solution for the HDR to improve, and I can't see them going the FALD route on this, as that would make it far too price-prohibitive. The CRG9 has been quite a success by most accounts, so I don't think they will deviate far from its price point or market position. Yes, it will be expensive, and more than the CRG9, but not massively so.
 
It will need a radically different backlight solution for the HDR to improve, and I can't see them going the FALD route on this, as that would make it far too price-prohibitive. The CRG9 has been quite a success by most accounts, so I don't think they will deviate far from its price point or market position. Yes, it will be expensive, and more than the CRG9, but not massively so.

Oh, I do agree, but pretty much anything would be an improvement on the HDR. The 10 zones on it are just too few to handle it even half-assedly. If they could double the zones, either as narrower vertical columns or by having separately controlled LEDs at, say, the top and bottom, it would be an improvement. The CRG9 honestly looks awesome in SDR, but it's a real shame that its HDR performance doesn't quite hack it.

EDIT: And then a game comes along that shifts how I think of the CRG9. I started Hitman 2 and frankly it looks pretty great in HDR. It can also activate HDR without it being enabled on the desktop, whereas most games don't think HDR even exists if it is not enabled on the desktop too. I wish all games handled it as seamlessly as Hitman; it would make for a far better gaming experience.
 
I just got it confirmed: the Samsung Odyssey G9 will support DSC! No more speculation on how the monitor achieves 240 Hz without any drawbacks!

Source: Samsung Business USA
 
I just got it confirmed: the Samsung Odyssey G9 will support DSC! No more speculation on how the monitor achieves 240 Hz without any drawbacks!

Source: Samsung Business USA

Oh, that is actually really cool! Did they explicitly say that DSC allows it to reach all the way to 240 Hz, rather than, say, just 144 Hz?

Now, to be fair, on the CRG9 I can barely get 60-100 fps in most games with a 2080 Ti, so I am not sure how useful that refresh rate is. The biggest benefit is that HDR becomes a bit easier to use: you can just leave it at 10-bit color all the time, since you no longer have to drop to 100 Hz to enable 10-bit color.
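That 100 Hz limitation lines up with rough bandwidth math for the CRG9 (active pixels only, ignoring blanking, so treat the numbers as approximate):

```python
# 5120x1440 with 10-bit RGB on a single DP 1.4 link (~25.9 Gbit/s payload)
def gbps(refresh_hz: int, bits_per_pixel: int = 30) -> float:
    return 5120 * 1440 * refresh_hz * bits_per_pixel / 1e9

print(f"120 Hz, 10-bit: ~{gbps(120):.1f} Gbit/s")  # ~26.5 Gbit/s, over the limit
print(f"100 Hz, 10-bit: ~{gbps(100):.1f} Gbit/s")  # ~22.1 Gbit/s, fits
# With blanking overhead the 120 Hz figure is even higher, which is why
# 10-bit color forces the refresh rate down without DSC.
```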
 
Not to mention that HDR1000 on a VA panel can only be achieved via local dimming, which sucks. Also, it is too wide and too narrow.

If it were proper FALD, it would not necessarily suck. But these monitors use edge-lit local dimming, which basically lets Samsung cheat the certifications but does fuck-all for real-life content as far as contrast range goes.
 
Oh, that is actually really cool! Did they explicitly say that DSC allows it to reach all the way to 240 Hz, rather than, say, just 144 Hz?

Now, to be fair, on the CRG9 I can barely get 60-100 fps in most games with a 2080 Ti, so I am not sure how useful that refresh rate is. The biggest benefit is that HDR becomes a bit easier to use: you can just leave it at 10-bit color all the time, since you no longer have to drop to 100 Hz to enable 10-bit color.

Yes, this is what they told me:
"The Samsung Odyssey G9 does indeed support DSC. (It's what makes DQHD 240Hz possible!)"
 
I really wanted to order this monitor or its predecessor, as 120 Hz is fine. However, the lack of KVM switching is a deal killer. Both the AOC Agon AG493UCX and the LG 49WL95C-W have integrated KVM switches. For me this will be a work/game monitor, so it seems a bizarre oversight not to include a KVM at this price point.
 
I really wanted to order this monitor or its predecessor, as 120 Hz is fine. However, the lack of KVM switching is a deal killer. Both the AOC Agon AG493UCX and the LG 49WL95C-W have integrated KVM switches. For me this will be a work/game monitor, so it seems a bizarre oversight not to include a KVM at this price point.
Can't you just get an external KVM or use a software solution? For my use it's fine, as the most I'll use the second input for is capturing console footage.
 
Can't you just get an external KVM or use a software solution? For my use it's fine, as the most I'll use the second input for is capturing console footage.

I don't believe the picture-by-picture modes work if you use an external KVM switch. If you have admin privileges on your work computer you can use a software KVM, but that is inferior to a hardware KVM.
 
I just got it confirmed: the Samsung Odyssey G9 will support DSC! No more speculation on how the monitor achieves 240 Hz without any drawbacks!

Source: Samsung Business USA
DSC adds input latency. It's why NVIDIA didn't add it to G-SYNC Ultimate monitors.
 
DSC adds input latency. It's why NVIDIA didn't add it to G-SYNC Ultimate monitors.

Got any sources for that? Because I can't find anything relevant by googling display stream compression and input lag.
 
DSC adds input latency. It's why NVIDIA didn't add it to G-SYNC Ultimate monitors.
That's clearly speculation and hasn't been proven anywhere. Testing of the XG27UQ by TFTCentral showed no change in lag with or without DSC being used.
 
That's clearly speculation and hasn't been proven anywhere. Testing of the XG27UQ by TFTCentral showed no change in lag with or without DSC being used.
I will say that DSC, by being digital compression, must impart some processing lag.

Whether that lag is measurable, let alone noticeable, in a broad range of applications appears up for debate. It's most definitely going to have a cost associated with it too, of course, so it likely won't be used if not needed.
 
I will say that DSC, by being digital compression, must impart some processing lag.

Whether that lag is measurable, let alone noticeable, in a broad range of applications appears up for debate. It's most definitely going to have a cost associated with it too, of course, so it likely won't be used if not needed.

DSC could add lag, but any decent implementation of it will not add latency. Good hardware encoders/decoders work nearly instantaneously. Just look at DTS and Dolby audio encoders/decoders. It's going to be similar to that. Shitty hardware can add lag, but most things don't.
 
DSC could add lag, but any decent implementation of it will not add latency.
These are synonyms. I expect that any lag or latency will depend on how well the technology is implemented, and perhaps there's a spec to keep that from getting out of hand.

Good hardware encoders/decoders work nearly instantaneously. Just look at DTS and Dolby audio encoders/decoders. It's going to be similar to that.
These add noticeable lag to audio. I switched away from Dolby Digital Live for pre-HDMI audio transmission to a receiver years ago because of this. Fine for media streams, not fine for anything that involves real-time interaction, and definitely not fine for shooters or other action games.
 
These are synonyms. I expect that any lag or latency will depend on how well the technology is implemented, and perhaps there's a spec to keep that from getting out of hand.


These add noticeable lag to audio. I switched away from Dolby Digital Live for pre-HDMI audio transmission to a receiver years ago because of this. Fine for media streams, not fine for anything that involves real-time interaction, and definitely not fine for shooters or other action games.

Yes, but there are tons of things that encode/decode it without lag. Basically every game console encodes audio with no lag, and the vast majority of receivers and TVs can decode it without any lag. If you're encoding it on a PC through software, it's going to be slow.
 
Yes, but there are tons of things that encode/decode it without lag. Basically every game console encodes audio with no lag, and the vast majority of receivers and TVs can decode it without any lag. If you're encoding it on a PC through software, it's going to be slow.
And that's kind of the point; we don't have enough data points to reasonably state whether DSC adds non-negligible lag or not, or whether implementations may differ on that point, say the way FreeSync implementations do.
 
Can't you just get an external KVM or use a software solution? For my use it's fine, as the most I'll use the second input for is capturing console footage.
Level1 has a KVM that supports 4K @ 100 Hz+ and starts around $300, but as burburbur mentioned, you can't do side-by-side with it. I've used software-based KVMs and they all have flaws that make them a poor substitute.
 
I have my doubts about whether the VA panel can keep up with 240 Hz. While Rtings.com gave the CRG9 very good response time marks, those were for a panel running at 120 Hz; doubling the refresh rate would require even better response times. Likewise, it's still a mystery how that 240 Hz is achieved. Ideally it supports DP 2.0 and HDMI 2.1, but the worst case is that it requires chroma subsampling, dual DP inputs, or crap like that.
240 Hz only means that the LCD matrix receives 240 full refreshes per second, which amounts to one full refresh per ~4.167 ms.
It does not guarantee that pixel colors can change fast enough to let you actually see 240 unique frames per second. Some color transitions will be quick enough (say, under 4 ms), but as is typical for VA, some will most likely not switch fast enough to resolve even 60 fps content properly...

Let's see how it looks on the previous model (Samsung C49RG90):
[Image: C49RG90 response time measurements]
A 33.9 ms color transition from a subpixel value of 0 to 50 means that any transition where one subpixel goes from completely off (which covers quite a lot of colors when you count them up...) to a somewhat activated state takes long enough that even claiming 30 Hz would be barely acceptable, let alone 60 Hz or, even worse, 120 Hz... Also keep in mind that this is not even the full pixel transition time but only the time to reach ~90% of the final value or thereabouts, and that a switch from 0 to something like 25 will take even longer. One thing that makes it less disastrous is that most of the transition happens somewhat faster and the color is only very slow over the last third of the way to the target value. It still means that such a VA panel will exhibit ugly dark streaks...
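To put that figure against the advertised refresh rate, a quick back-of-the-envelope check (using the 33.9 ms number quoted above):

```python
# How many 240 Hz refresh intervals a 33.9 ms transition spans
refresh_interval_ms = 1000 / 240        # ~4.17 ms per refresh
transition_ms = 33.9                    # measured 0 -> 50 transition quoted above

frames = transition_ms / refresh_interval_ms
print(f"{transition_ms} ms ≈ {frames:.1f} refresh intervals at 240 Hz")
# ≈ 8 frames: the pixel is still mid-transition many refreshes later,
# which is what shows up as dark smearing behind moving objects.
```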

This alone makes me not recommend any VA panel.
There is also the issue of gamma shift. The curved nature of the monitor somewhat mitigates this, but the issue is still there: you cannot have consistent gamma across the whole screen, and each eye will pretty much see slightly different colors.

As to "how they are doing it"... probably 240Hz is available only in 4:2:2 mode 8bpp, maybe even at lower color resolution (not bit depth though) and full 4:4:4/RGB 10bpp being available with reduced refresh rate.

It is probably best to wait for reviews.
I still do not recommend VA panels, and I do not even recommend curved monitors. It might seem great to have a surround monitor, but keep in mind that games do not draw the picture with correct perspective for such monitors; they all pretty much expect flat screens. It is not even that easy to make a game render properly for a curved monitor with rasterization. It would be possible to roughly correct the perspective in post-processing, but for a proper result you would need full ray tracing with the curvature of your monitor taken into account.
 
There is supposedly additional latency with DSC; I swear I had read somewhere that this was the reason it was not being used by G-SYNC Ultimate. Turns out it is actually due to a lack of supported features in the DSC implementation included in the initial DisplayPort 1.4 specification, which was all that was available when the new version of G-SYNC was being developed and tested. DisplayPort 1.4a fixed the DSC implementation, which is why we are starting to see more G-SYNC displays using DSC.
 
240 Hz only means that the LCD matrix receives 240 full refreshes per second, which amounts to one full refresh per ~4.167 ms.
It does not guarantee that pixel colors can change fast enough to let you actually see 240 unique frames per second. Some color transitions will be quick enough (say, under 4 ms), but as is typical for VA, some will most likely not switch fast enough to resolve even 60 fps content properly...

Let's see how it looks on the previous model (Samsung C49RG90):
[Image: C49RG90 response time measurements]
A 33.9 ms color transition from a subpixel value of 0 to 50 means that any transition where one subpixel goes from completely off (which covers quite a lot of colors when you count them up...) to a somewhat activated state takes long enough that even claiming 30 Hz would be barely acceptable, let alone 60 Hz or, even worse, 120 Hz... Also keep in mind that this is not even the full pixel transition time but only the time to reach ~90% of the final value or thereabouts, and that a switch from 0 to something like 25 will take even longer. One thing that makes it less disastrous is that most of the transition happens somewhat faster and the color is only very slow over the last third of the way to the target value. It still means that such a VA panel will exhibit ugly dark streaks...

This alone makes me not recommend any VA panel.
There is also the issue of gamma shift. The curved nature of the monitor somewhat mitigates this, but the issue is still there: you cannot have consistent gamma across the whole screen, and each eye will pretty much see slightly different colors.

As to "how they are doing it"... probably 240Hz is available only in 4:2:2 mode 8bpp, maybe even at lower color resolution (not bit depth though) and full 4:4:4/RGB 10bpp being available with reduced refresh rate.

It is probably best to wait for reviews.
I still do not recommend VA panels, and I do not even recommend curved monitors. It might seem great to have a surround monitor, but keep in mind that games do not draw the picture with correct perspective for such monitors; they all pretty much expect flat screens. It is not even that easy to make a game render properly for a curved monitor with rasterization. It would be possible to roughly correct the perspective in post-processing, but for a proper result you would need full ray tracing with the curvature of your monitor taken into account.

I have had the CRG9 since last September and know its pros and cons well; I have written everything I know about it on Reddit. I have a fast ASUS PG278Q TN panel and an LG C9 OLED TV to compare against here, and I'm running a 2080 Ti. I don't play super fast competitive shooters, so personally I am fine with the response time, and I haven't found its dark transitions to be an issue, for example during night scenes in games like RDR2 (which runs at about 60 fps) or Sekiro (which runs at 100-120 fps modded for super ultrawide resolution). While the CRG9's response time is not as good as my other monitors', it's far better than my Samsung KS8000 TV (also VA), where I could clearly see black trailing during gaming. On the CRG9 I only notice it on the desktop on dark backgrounds, say Windows settings in dark mode: while scrolling, text looks thinner until the scrolling stops and the image stabilizes again. So it's definitely not a display for dark mode lovers.

Could its response time be better? Absolutely! I would love that, and I really hope the G9 delivers on this front. I'll go through the G9 manual when it is available and try to post a summary of the differences that are not apparent from the specs alone.

To me a curve is essential on any ultrawide monitor, and the CRG9 could definitely be more curved given its width. It makes the very edges more usable and helps avoid the viewing angle issues VA can have. A curve causes absolutely no issues with rendering in games and does not need to be taken into account in software at all, but super ultrawide resolution does. Many games will show stretching at the very edges of the screen because their FOV implementation is not made for something this wide. Funnily enough, the best FOV implementation in any game I have played is in Control, which does not even support super ultrawide without a 3rd-party patcher! No FOV distortion, just a nice wide view that helps make its environments feel less claustrophobic. It now also works with DLSS 2.0, which is really nice, so it performs like a champ.
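As an illustration of why super ultrawide FOV gets tricky, here is the standard "hor+" relationship between vertical FOV, aspect ratio, and horizontal FOV under a normal flat-screen (rectilinear) projection; the numbers are generic, not taken from any particular game:

```python
import math

def horizontal_fov(vertical_fov_deg: float, aspect: float) -> float:
    """Horizontal FOV implied by a fixed vertical FOV under a standard
    rectilinear projection (the usual "hor+" scaling)."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

vfov = 58.7  # vertical FOV that gives ~90 deg horizontal at 16:9
for name, aspect in [("16:9", 16 / 9), ("21:9", 21 / 9), ("32:9", 32 / 9)]:
    print(f"{name}: hFOV ≈ {horizontal_fov(vfov, aspect):.1f} deg")
# 16:9 ≈ 90.0, 21:9 ≈ 105.4, 32:9 ≈ 126.9 deg
# At ~127 deg a rectilinear projection stretches objects near the screen edges
# heavily, which is the edge stretching described above.
```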

This forum is way too often all about perfect when good is totally fine. Not everything is at the level of an OLED TV, unfortunately, and sometimes these compromises are acceptable because you prefer the form factor and/or feature set. I have had a really good time working and gaming on the CRG9, so I expect the G9 to be even better for both.
 
Then you more or less know what to expect. Good :)

I would not expect major improvements in LCD technology, as progress seems to have stopped as much as ten years ago. There has been progress in backlights and control circuitry, but the LCD matrices themselves are pretty much the same as they were, say, 10 years ago.

One of the nice VA tricks that was used before and isn't anymore is pre-tilting. It works by buffering one frame and first rotating the sub-pixel opposite to its desired orientation, then to the desired one. This especially accelerated transitions from black to darker shades. It was used on most S-PVA panels and probably some S-MVA, but I'm not sure about the latter. I had a Dell 2407WFP with this technology, and it almost completely lacked the dark streaks I got on later VA panels without pre-tilting. With the ability to refresh the whole screen in ~4 ms, I could see it being somewhat feasible to implement, assuming a 120 fps limit. In the best case it would only add about 6 ms of input lag by starting the pre-tilt pass exactly when the GPU sends the middle line; it would finish drawing the actual frame just before the GPU (if it operated at 120 fps) was about to send the middle line of the next frame, so the whole process could just repeat. I would not hold my breath for such a feature though, and especially not for the idea that they would implement it correctly :)
 