
Re: Cintiq24pro 8 bit only???

Posted: Wed 18. Sep 2019, 05:13
by mikycoud
Hi, sorry for the late reply; the wife made me take some time away from my computer (aka holidays :-)
Anyway, yep, there's plenty of difference between my NEC (a true 10-bit display) and the Cintiq 24 Pro.
Then again, the NEC costs more than the Cintiq, and it's "only" a display.
It's really obvious on gradients (especially monochromatic ones), where unwanted, contaminating hues are noticeable on the Wacom and not on the NEC. Banding is also present on the Wacom, though forcing 10-bit in the Nvidia drivers does make things better there (though not close to the NEC's smoothness).
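For anyone curious why banding is so visible on gradients, here's a rough back-of-the-envelope sketch (the function name and numbers are just made up for illustration): an 8-bit panel can only show 256 distinct levels per channel, so a wide smooth ramp gets chopped into visibly wide bands, while 10 bits gives 1024 levels and bands a quarter as wide.

```python
# Toy sketch: quantize a smooth 0..1 ramp to a given bit depth and count
# how many distinct codes (i.e. visible gradient steps) survive.
def distinct_levels(width, bits):
    max_code = (1 << bits) - 1
    # round each pixel of the ramp to the nearest displayable code
    codes = {round(x / (width - 1) * max_code) for x in range(width)}
    return len(codes)

print(distinct_levels(4096, 8))   # 256 steps: each band is ~16 px wide
print(distinct_levels(4096, 10))  # 1024 steps: bands are 4x narrower
```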
I suspect the Wacom panel is an 8-bit-only display, with some FRC-like technology to emulate a proper 10-bit display.
Also, the fact that it doesn't seem to be hardware-calibratable (does such a word even exist?) says a lot about the technology used.
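To illustrate what I mean by FRC-like (this is just a toy model, not anything Wacom has confirmed): the panel flickers each pixel between two adjacent 8-bit codes so that the temporal average lands on the in-between 10-bit value.

```python
# Toy model of 2-bit FRC (frame rate control): approximate a 10-bit level
# on an 8-bit panel by alternating between two neighbouring 8-bit codes
# over a 4-frame cycle. (Real panels map the top of the range differently;
# this sketch just clamps to stay within 8 bits.)
def frc_frames(level_10bit):
    base, rem = divmod(level_10bit, 4)  # 10-bit code = 4 * 8-bit code + rem
    # show the brighter code on `rem` of the 4 frames in the cycle
    return [min(base + 1, 255) if i < rem else base for i in range(4)]

frames = frc_frames(513)            # [129, 128, 128, 128]
avg = sum(frames) / len(frames)     # 128.25, i.e. exactly 513 / 4
```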
Anyway, the fact that you can force 10-bit output in your driver settings doesn't make sense, since you're using a GeForce (non-Quadro) card, which does not support 10-bit output via OpenGL.
So that setting shouldn't even be there to start with, unless it's referring to something other than OpenGL output.
Either way, you shouldn't see a difference on your setup whether that option is set to 8-bit or 10-bit.

All that being said, I find Wacom's communication on this specific topic really opaque. I mean, is it really that hard for Wacom to give us some details about the tech used in their expensive flagship product aimed at professionals?

Re: Cintiq24pro 8 bit only???

Posted: Tue 24. Sep 2019, 01:05
by kangum
It seems Wacom is dodging this thread. Maybe it is 8-bit after all.

Re: Cintiq24pro 8 bit only???

Posted: Tue 8. Oct 2019, 14:18
by LogicBrain
kangum wrote:It seems Wacom is dodging this thread. Maybe it is 8-bit after all.

Seems like it lol.



mikycoud wrote:Hi, sorry for the late reply; the wife made me take some time away from my computer (aka holidays :-)
Anyway, yep, there's plenty of difference between my NEC (a true 10-bit display) and the Cintiq 24 Pro.
Then again, the NEC costs more than the Cintiq, and it's "only" a display.
It's really obvious on gradients (especially monochromatic ones), where unwanted, contaminating hues are noticeable on the Wacom and not on the NEC. Banding is also present on the Wacom, though forcing 10-bit in the Nvidia drivers does make things better there (though not close to the NEC's smoothness).
I suspect the Wacom panel is an 8-bit-only display, with some FRC-like technology to emulate a proper 10-bit display.
Also, the fact that it doesn't seem to be hardware-calibratable (does such a word even exist?) says a lot about the technology used.
Anyway, the fact that you can force 10-bit output in your driver settings doesn't make sense, since you're using a GeForce (non-Quadro) card, which does not support 10-bit output via OpenGL.
So that setting shouldn't even be there to start with, unless it's referring to something other than OpenGL output.
Either way, you shouldn't see a difference on your setup whether that option is set to 8-bit or 10-bit.

All that being said, I find Wacom's communication on this specific topic really opaque. I mean, is it really that hard for Wacom to give us some details about the tech used in their expensive flagship product aimed at professionals?


I was able to try out an NEC. Love them, but the refresh rate (Hz) is too low for me. Gotta wait till a faster one comes out. Thanks for the info!