30-bit display with Krita?

Boudewijn Rempt boud at valdyas.org
Tue Dec 29 14:27:51 CET 2009


On Tuesday 29 December 2009, Adrian Page wrote:
> On 29 Dec 2009, at 12:14PM, Boudewijn Rempt wrote:
> > On Tuesday 29 December 2009, LukasT.dev at gmail.com wrote:
> >> On Tuesday 29 December 2009 09:06:20 Boudewijn Rempt wrote:
> >>> On Friday 25 December 2009, Adrian Page wrote:
> >>>> It will also be necessary to request a 10-bit visual when the OpenGL
> >>>> context is created. Without that, there's likely no benefit in storing
> >>>> the data in greater than 8 bits as it will just get converted to 8-bit
> >>>> when written to the frame buffer, unless the card opts to dither (my
> >>>> Nvidia card doesn't).
> >>>
> >>> Is that actually doable with QGLWidget? I've tried to find something in
> >>> the Qt docs, but I cannot figure it out.
> >>
> >> Creating an OpenGL rendering context with Qt is described here:
> >>
> >> http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=262519
> >
> > That doesn't look too good...
> 
> I'm not sure why those examples are doing things at that low a level. Qt
>  has a QGLFormat class that you can use to set the context format before
>  it's created. That class includes options to set the red/green/blue
>  channel sizes, which is what we'd want to change. See
>  http://doc.trolltech.com/4.6/qglformat.html#setDefaultFormat.
> 
> Note that only graphics cards supporting 30-bit mode will allow you to
>  create a 30-bit context, so this will be difficult to test.
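
In code, Adrian's suggestion boils down to something like the sketch below
(my untested reading of the Qt 4 QGLFormat API; I don't have a 30-bit card
to try it on):

#include <QApplication>
#include <QGLWidget>
#include <QtDebug>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    // Ask for 10 bits per colour channel before any context is created.
    QGLFormat fmt = QGLFormat::defaultFormat();
    fmt.setRedBufferSize(10);
    fmt.setGreenBufferSize(10);
    fmt.setBlueBufferSize(10);
    fmt.setAlphaBufferSize(2);   // 10 + 10 + 10 + 2 = 32 bits
    QGLFormat::setDefaultFormat(fmt);

    QGLWidget widget;
    widget.show();

    // The driver may silently hand back an 8-bit visual, so check what
    // we actually got rather than trusting the request.
    qDebug() << "red channel bits:" << widget.format().redBufferSize();

    return app.exec();
}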

Well, we can always add it as an explicit option, with a warning, if GLEW
cannot figure out whether the card supports it.
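
Something like this would do as the runtime check (rough sketch; note that
glGetIntegerv with GL_RED_BITS is plain OpenGL 1.x, so GLEW isn't even
needed for it, but it does require a current context):

#include <QGLWidget>

// Call with the widget's context current, e.g. from initializeGL().
bool contextHas10BitChannels()
{
    GLint redBits = 0;
    glGetIntegerv(GL_RED_BITS, &redBits);
    return redBits >= 10;
}

If this returns false we fall back to 8-bit and show the warning.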

-- 
Boudewijn Rempt | http://www.valdyas.org

