30-bit displaying with Krita?
Kai-Uwe Behrmann
ku.b at gmx.de
Thu Dec 24 12:40:09 CET 2009
Date: Wed, 23 Dec 2009 16:39:30 +0100 (CET)
From: Boudewijn Rempt <boud at valdyas.org>
> On Wed, 23 Dec 2009, LukasT.dev at gmail.com wrote:
>
>> On Wednesday 23 December 2009 16:02:47 Boudewijn Rempt wrote:
>>> On Wed, 23 Dec 2009, Kai-Uwe Behrmann wrote:
>>>> In Krita 2.0.2 on x86_64 Linux, image display appears to be limited to
>>>> 8 bits per channel even with OpenGL switched on. The source material
>>>> is a 16-bit, 3-channel gray gradient TIFF file. But the display output
>>>> looks like 8-bit. Krita's colour picker shows fine differences at the
>>>> pixel level, so loading appears to be correct. However, the Krita display
>>>> shows fewer differences. Does Krita display its data in OpenGL as
>>>> GL_RGB16/GL_UNSIGNED_SHORT?
>>>>
>>>> Any other idea why Krita gives only 8-bit output?
>>>
>>
>> I did not know about displays that have a channel size bigger than 8 bits.
>> So you have 30 bits per channel, right?
>
> 10 bits -- those displays are beginning to become affordable, for some
> values of affordable.
... and people who are willing to spend that money will want to see the
improved quality.
Please keep in mind that motion picture rendering is done at more than
8 bits per channel. I would also guess that cinema projection is 10- or
12-bit [1]. So for those people it makes sense to see that quality
already during editing. If Krita is intended to have an audience in
motion picture related editing, I guess it makes sense to support the
higher bit depths.
[1] http://en.wikipedia.org/wiki/Digital_cinema#Technology
> Date: Wed, 23 Dec 2009 18:40:34 +0300
> From: Dmitry Kazakov <dimula73 at gmail.com>
> Subject: Re: 30-bit displaying with Krita?
> To: "Krita's developers and users mailing list" <kimageshop at kde.org>
>
> KisPainterCanvas uses 8 bits as well. Truly speaking, I can't imagine a case
> where one can see any difference in colors between 16-bit and 8-bit depth on
> a regular monitor. As far as I remember, the DVI standard uses "up to 24-bit"
Regular monitors for image editing are clearly moving to higher bit
depths. Of course, looking at today's laptop displays, one might argue
that only 6 bits per component are useful. But I guess that is not what
many artists want either.
> representation, so 16-bit won't help anyway ;)
HDMI 1.3 can use more than 8 bits per component, as can DisplayPort. For
DVI you might be right.
10+10+10+2 bits RGBA gives 30-bit RGB packed into a 32-bit pixel. The
monitor must obviously support decoding that. These devices also support
today's typical 8+8+8+8 RGBA in 32 bits per pixel. (I don't know the
exact channel order.)
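
As an illustration only, such a 10+10+10+2 layout could be packed like
this; the channel order chosen here (red in the low bits) is an
assumption, since, as said, the actual order depends on the hardware and
driver:

  // Illustrative only: pack 10-bit R, G, B plus a 2-bit alpha/padding field
  // into one 32-bit pixel. The channel order is an assumption.
  #include <cstdint>

  uint32_t pack_rgb10a2(uint16_t r, uint16_t g, uint16_t b, uint8_t a)
  {
      return  (uint32_t(r & 0x3FF))        // 10 bits red
            | (uint32_t(g & 0x3FF) << 10)  // 10 bits green
            | (uint32_t(b & 0x3FF) << 20)  // 10 bits blue
            | (uint32_t(a & 0x3)   << 30); //  2 bits alpha/padding
  }
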
> Could you publish the test, please?
Create a simple gradient from black to white, say 4000 pixels wide. The
stepping is obvious on well-calibrated display hardware.
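
For reference, a minimal sketch of such a test image, written as a 16-bit
grayscale PGM rather than TIFF only to keep it dependency-free; the
4000-pixel width is just the value suggested above:

  // Sketch: writes a 4000x100 pixel 16-bit grayscale gradient as binary PGM
  // (maxval 65535, most significant byte first per the Netpbm convention).
  // On an 8-bit display path the gradient shows visible steps; on a true
  // 10-bit or better path it should look smooth.
  #include <cstdint>
  #include <fstream>

  int main()
  {
      const int width = 4000, height = 100;
      std::ofstream out("gradient16.pgm", std::ios::binary);
      out << "P5\n" << width << " " << height << "\n65535\n";
      for (int y = 0; y < height; ++y)
          for (int x = 0; x < width; ++x) {
              uint16_t v = uint16_t(uint64_t(x) * 65535 / (width - 1));
              out.put(char(v >> 8));   // high byte first
              out.put(char(v & 0xFF)); // low byte
          }
  }
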
> Maybe you mean "stripes" on gradients, don't you?
That is the easiest test case. So far I have not come around to viewing
much imagery, as I have only one rudimentary test application. Perhaps I
can integrate that capability into Oyranos' image_display example.
> If so, the problem is a bit worse than just a "16-bit" canvas, as it might
> not help. Most of the editors, afaik, use "dithering" during conversion
> from 16-bit to 8-bit. They add special noise to the image to hide these
> stripes (at least Photoshop does).
This conversion to 8-bit of course reduces the colour resolution. You are
discussing a workaround, not a solution to the initial request?
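
For illustration, the dithering described above could look roughly like
the following; the plain uniform noise here is just one possible choice,
real editors may use ordered or error-diffusion dithering instead:

  // Sketch: convert one 16-bit channel value to 8-bit with simple random
  // dithering. Adding noise of about one output quantisation step before
  // truncation breaks up the visible banding, at the cost of some grain.
  #include <cstdint>
  #include <random>

  uint8_t dither_16_to_8(uint16_t v, std::mt19937 &rng)
  {
      std::uniform_int_distribution<int> noise(0, 255); // ~1 LSB of 8-bit output
      int out = (int(v) + noise(rng)) >> 8;             // 0..65535 -> 0..255
      return uint8_t(out > 255 ? 255 : out);
  }
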
I wrote in the hope that supporting GL_RGB16/GL_UNSIGNED_SHORT would be a
minor task, given that Krita can already use the well-supported industry
standard OpenGL API.
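
For reference, the texture upload itself would amount to little more than
a glTexImage2D call like the sketch below; whether the driver really keeps
16 bits internally, and whether the window/framebuffer is configured for
30-bit output, is a separate question:

  // Sketch: request a 16-bit-per-channel internal format for the canvas
  // texture. 'pixels' is assumed to be tightly packed unsigned short RGB
  // data of size width x height; real code also needs a GL context, texture
  // setup and a 30-bit visual to actually benefit on screen.
  #include <GL/gl.h>

  void uploadCanvas16(int width, int height, const unsigned short *pixels)
  {
      glTexImage2D(GL_TEXTURE_2D,
                   0,                  // mipmap level
                   GL_RGB16,           // 16 bits per channel internally
                   width, height,
                   0,                  // border
                   GL_RGB,             // source data format
                   GL_UNSIGNED_SHORT,  // source data type
                   pixels);
  }
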
kind regards
Kai-Uwe Behrmann
--
developing for colour management
www.behrmann.name + www.oyranos.org
PS: please keep me in CC, so I can respond sooner.