Considering the BL3201PT to upgrade from 43X800D. Do I need Quadro for 10 bits?

LGabrielPhoto

2[H]4U
Joined
Jan 5, 2006
Messages
3,240
Hello!

So I own the Sony 43X800D, but since it is a VA panel, the image colors shift depending on where I move my head, even a little, and that bothers me when I am editing my photos. The original idea was a big monitor for editing that I could also use for gaming.
Plus, looking at photos on this 43" TV makes images more exciting than on a small monitor.
Still, as I basically never game, I feel I could use a more accurate display for my photo and video editing.

Trying to keep it under $1000, and this one seems like a good choice. Originally I wanted to stay at least in the 40" range, but it looks like that is NOT going to happen, so I will see if 32" works for me.

Anyway, on 64-bit Windows 10 with a GTX 980 Ti, will I need to add a Quadro card to my system (hopefully keeping both cards), like one of the entry-level 400 series, in order to get 10-bit 4:4:4 at 4K 60 Hz?


Thanks
 
I have the BenQ BL3201PT, used it on an ASUS GTX 980 Ti and now on an ASUS GTX 1080 Ti.

On both GPUs, it has been possible to go into NVIDIA's display settings and set a 10 bpc output depth; see http://imgur.com/a/wQheo (the options there are 8 bpc and 10 bpc). However, I cannot say I see any difference at all on this display between those two modes. I am an enthusiast hobbyist photographer, using a Sony A7R II, Capture One Pro 10, Photomatix Pro 5, and Corel PaintShop Pro X8, and at least in those applications I am not able to distinguish 8 bpc mode from 10 bpc mode when editing raw photos from the camera. https://pcmonitors.info/reviews/benq-bl3201pt-bl3201ph/ suggests it is an 8-bit+FRC panel, though I do not know enough to say how different that is from a plain 8-bit or a true 10-bit panel.

I have a good impression of BenQ. My purchase came with a 3-year warranty, and after I had the display for 2 years, it started making a buzzing noise. BenQ RMA picked it up with FedEx, no questions asked, and I got a unit back a couple of days later. Curiously, the RMA email said they might replace it with a BenQ PD3200U, my impression being that the BL3201PT has now been discontinued, though the display I got back from warranty was another BL3201PT (with a different serial number on the back, so the original device was not repaired).

The separate control puck is great; I love not having to reach for buttons under or behind the display.

http://www.tomshardware.com/reviews/benq-pd3200u-32-inch-uhd-monitor,4983.html reviews the newer BenQ PD3200U. Skimming that article, it looks like the PD3200U does replace the BL3201PT. In Tom's Hardware's test, the PD3200U shows improved black levels, and hence contrast ratio, though it was criticized for its nonuniformity. http://www.benq.com/product/monitor/pd3200u suggests the PD3200U also has a KVM switch, which the BL3201PT does not. The PD3200U also has HDMI 2.0 (http://www.benq.com/product/monitor/pd3200u/specifications/), unlike the BL3201PT, which only has HDMI 1.4. As a result, over an HDMI cable the BL3201PT can only do 3840x2160 at 30 Hz; you need a DisplayPort or Mini DisplayPort connection to reach 3840x2160 at 60 Hz.

If there is no price difference, I would get the PD3200U over the BL3201PT. I don't know whether the PD3200U is also 8-bit+FRC or true 10-bit, though I would not purchase either on the basis of getting a 10-bit display. Rather, I'd get an HDR-enabled display for deeper color spaces, for example the BenQ SW320, or perhaps the Dell UP2718Q.
 
You will also benefit from the higher pixel density of the 32" vs. the 40". Things will look sharper and less "blocky", and also more lifelike as the pixels are smaller. Something else to take into consideration. You will be happy with the 32"; I wouldn't want anything bigger, as 32" is already massive for a monitor. I have the HP Spectre 32, which is the glossy version of the BL3201PT (same exact panel).
 

Interesting that with my current TV the options are 8 bpc and 12 bpc instead of 10. I don't even think this TV is a real 10-bit panel. Thanks for the heads-up about the newer model as well. Maybe I will wait and get the SW320, or wait even longer for a 40"; I just love the size lol
With the i1Display Pro my Sony is looking much better and very close to my smaller monitors, so I think I can hold off for now.
Thanks!
 
Yes, you will need a FirePro or Quadro in order to see and edit 10 bpc images in Photoshop. The mainstream gaming cards do not support 10-bit OpenGL buffers.
 
Yeah, that is what I have been reading! Just trying to figure out whether my Quadro would be used only for Photoshop and similar, with the 980 Ti for everything else, IF I decide to get a Quadro and a new monitor, that is.
 

That got me curious.

I implemented a Direct3D 11 test application to try out the 10 bpc display mode. With DXGI_FORMAT_R10G10B10A2_UNORM (https://msdn.microsoft.com/fi-fi/library/windows/desktop/mt427455(v=vs.85).aspx), I had no trouble creating an exclusive fullscreen application that renders a test gradient in 10 bpc mode. The result looks like this:
- grayscale gradient: http://imgur.com/a/XweB4
- green gradient: http://imgur.com/a/Yn1y7

The top edge of the gradient is the 8 bpc version, and the bottom is the 10 bpc gradient. Viewing the same application in 8 bpc mode looks like this:
- grayscale: http://imgur.com/a/49yfT
- green: http://imgur.com/a/bJgJE

Getting the same working in a windowed-mode Direct3D 11 application seems to be impossible; even if the NVIDIA control panel has been set to 10 bpc mode, the result looks 8 bpc.

Then, implementing the same under OpenGL 4.5, following http://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf, it looks like WGL is not advertising 10 bpc pixel formats at all, neither in a wglGetPixelFormatAttribivARB() query nor when attempting to choose one explicitly via wglChoosePixelFormatARB(). There is, however, an R16G16B16A16_FLOAT mode that is advertised in WGL and initializes successfully, but it displays banded at 8 bpc. In that light, it does look like only exclusive fullscreen Direct3D applications get 10 bpc, and I am surprised how clearly I notice the difference from 8 bpc, although that might be because I had never actually been able to run the display at 10 bpc before.
 
Ah, reading here seems to confirm the same: https://devtalk.nvidia.com/default/...not-output-10-bit-color/post/5050337/#5050337 . NVIDIA deliberately chooses to support 10 bpc only in fullscreen Direct3D applications, likely because Photoshop and others use windowed-mode OpenGL. :| Seems like AMD needs to pull a Ryzen on NVIDIA here as well, although I'd presume that with the advent of HDR, 10 bpc will become mainstream, windowed-mode applications will get that support, and NVIDIA will be forced to follow and open up 10 bpc in OpenGL as well, or at least in exclusive fullscreen OpenGL.
 
To make things worse, it seems I cannot use both my GTX 980 Ti and a Quadro, so I have to choose which way to go. I guess at the end of the day displaying in 10 bits is not that crucial, but it still kind of sucks lol
 
Well, people build fast workstations to use exclusively with specific software; maybe it's time to do that. The graphics card must have a DP output, but there are really cheap FirePro models that will do the trick.
 