No subject

Kai-Uwe Behrmann ku.b at gmx.de
Wed Dec 30 15:22:31 CET 2009


> Message: 1
> Date: Tue, 29 Dec 2009 12:46:17 +0100
> From: "LukasT.dev at gmail.com" <lukast.dev at gmail.com>
> Subject: Re: 30-bit displaying with Krita?
> To: "Krita's developers and users mailing list" <kimageshop at kde.org>
> Message-ID: <200912291246.17523.LukasT.dev at gmail.com>
> Content-Type: Text/Plain;  charset="iso-8859-1"
>
> On Tuesday 29 December 2009 09:06:20 Boudewijn Rempt wrote:
>> On Friday 25 December 2009, Adrian Page wrote:
>>> It will also be necessary to request a 10-bit visual when the OpenGL
>>>  context is created. Without that, there's likely no benefit in storing
>>> the data in greater than 8 bits as it will just get converted to 8-bit
>>> when written to the frame buffer, unless the card opts to dither (my
>>> Nvidia card doesn't).
>>
>> Is that actually doable with QGlWidget? I've tried to find something in the
>>  Qt docs, but I cannot figure it out.
>>
>
> Creating OpenGL rendering context with Qt is described here
>
> http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=262519

Explicitly choosing the visual attributes seems problematic to me. It's 
better to select an XVisual that the system, e.g. Qt, has already figured 
out, and stick with that. Unlike the examples in the link above, Krita 
will presumably not need to use or check for alpha support.
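
A minimal sketch of what I mean (the class name is made up and this is 
untested inside Krita): after Qt has created the context, simply ask 
OpenGL what the chosen visual actually provides:

#include <QGLWidget>
#include <QDebug>

// Illustrative only: a QGLWidget that reports the bit depths of the
// visual Qt selected, instead of requesting explicit GLX attributes.
class DepthProbeWidget : public QGLWidget
{
protected:
    void initializeGL()
    {
        GLint red = 0, green = 0, blue = 0, alpha = 0;
        glGetIntegerv(GL_RED_BITS,   &red);
        glGetIntegerv(GL_GREEN_BITS, &green);
        glGetIntegerv(GL_BLUE_BITS,  &blue);
        glGetIntegerv(GL_ALPHA_BITS, &alpha);
        // 10/10/10 here means the system handed us a 30-bit visual;
        // 8/8/8 means it fell back to the usual depth.
        qDebug() << "R/G/B/A bits:" << red << green << blue << alpha;
    }
};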


> Message: 3
> Date: Tue, 29 Dec 2009 13:01:27 +0000
> From: Adrian Page <adrian at pagenet.plus.com>
> Subject: Re: 30-bit displaying with Krita?
> To: Krita's developers and users mailing list <kimageshop at kde.org>
> Message-ID: <80ACB022-4C6F-44CD-A5E0-95C31350B5CD at pagenet.plus.com>
> Content-Type: text/plain; charset=us-ascii
>
>
> On 29 Dec 2009, at 12:14PM, Boudewijn Rempt wrote:
>
>> On Tuesday 29 December 2009, LukasT.dev at gmail.com wrote:
>>> On Tuesday 29 December 2009 09:06:20 Boudewijn Rempt wrote:
>>>> On Friday 25 December 2009, Adrian Page wrote:
>>>>> It will also be necessary to request a 10-bit visual when the OpenGL
>>>>> context is created. Without that, there's likely no benefit in storing
>>>>> the data in greater than 8 bits as it will just get converted to 8-bit
>>>>> when written to the frame buffer, unless the card opts to dither (my
>>>>> Nvidia card doesn't).
>>>>
>>>> Is that actually doable with QGlWidget? I've tried to find something in
>>>> the Qt docs, but I cannot figure it out.
>>>
>>> Creating OpenGL rendering context with Qt is described here
>>>
>>> http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=262519
>>
>> That doesn't look too good...
>
> I'm not sure why those examples are doing things at that low a level. Qt has a QGLFormat class that you can use to set the context format before it's created. That class includes options to set the red/green/blue channel sizes, which is what we'd want to change. See http://doc.trolltech.com/4.6/qglformat.html#setDefaultFormat.

Just avoid
http://doc.trolltech.com/4.6/qglformat.html#alphaBufferSize
and let Qt figure out how to initialise the context. This works pretty 
flawlessly with the applications I tested on top of FLTK.
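
If it helps, something along these lines should be all that is needed on 
the Qt side; this is only a sketch of what Adrian describes with 
QGLFormat, untested here for lack of a 30-bit capable card:

#include <QApplication>
#include <QGLFormat>
#include <QGLWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    // Ask for 10 bits per colour channel before any GL widget exists.
    QGLFormat fmt = QGLFormat::defaultFormat();
    fmt.setRedBufferSize(10);
    fmt.setGreenBufferSize(10);
    fmt.setBlueBufferSize(10);
    // Deliberately do not touch the alpha buffer size, as said above.
    QGLFormat::setDefaultFormat(fmt);

    QGLWidget canvas;   // picks up the default format
    canvas.show();
    // Whether a 10-bit visual is actually granted depends on the
    // driver and the display; check canvas.format() to be sure.
    return app.exec();
}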

> Note that only graphics cards supporting 30-bit mode will allow you to create a 30-bit context, so this will be difficult to test.

If you can point me to a small example, I would be glad to test that for 
you.

> Adrian

kind regards
Kai-Uwe Behrmann
-- 
developing for colour management 
www.behrmann.name + www.oyranos.org


