[digiKam-users] Can digikam take advantage from modern graphic cards
Gilles Caulier
caulier.gilles at gmail.com
Wed Dec 26 19:36:36 GMT 2018
Hi,
Typically, no. Raw decoding will take advantage of CPU cores to run
demosaicing in parallel; the GPU has nothing to do here. LibRaw, used in the
background, does not use OpenCL here, as far as I know. If something must be
improved here, it's in the LibRaw core, not in digiKam.
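For illustration, here is a minimal sketch of what that CPU-side decoding
looks like through LibRaw's public C++ API (the command-line argument is just
a placeholder for a raw file path). The demosaicing inside dcraw_process()
runs entirely on the CPU, and is spread across cores when LibRaw is built
with OpenMP:

#include <libraw/libraw.h>
#include <cstdio>

int main(int argc, char **argv)
{
    if (argc < 2)
    {
        std::fprintf(stderr, "usage: %s <rawfile>\n", argv[0]);
        return 1;
    }

    LibRaw raw;

    // Load the file and unpack the raw sensor data.
    if (raw.open_file(argv[1]) != LIBRAW_SUCCESS) return 1;
    if (raw.unpack()           != LIBRAW_SUCCESS) return 1;

    // Demosaicing happens here, on the CPU only. When LibRaw is
    // compiled with OpenMP, this step runs across all CPU cores.
    if (raw.dcraw_process()    != LIBRAW_SUCCESS) return 1;

    // Fetch the rendered RGB image from memory.
    int err = 0;
    libraw_processed_image_t *img = raw.dcraw_make_mem_image(&err);
    if (img)
    {
        std::printf("decoded %dx%d pixels, %d bits/sample\n",
                    img->width, img->height, img->bits);
        LibRaw::dcraw_clear_mem(img);
    }

    raw.recycle();
    return 0;
}

Building it is just g++ decode_sketch.cpp -lraw, assuming the LibRaw
development headers are installed.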
But to be honest, using this kind of GPU extension would decrease portability
and make the code harder to maintain.
We are now based on Qt5, and the GUI widgets automatically take advantage of
the GPU to render images quickly on screen. There is no need to write any
GPU-specific code here, as it is all done in the Qt5 core.
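As a small sketch of what that means in practice (the image file name is a
placeholder): displaying a picture in a plain Qt5 widget involves no OpenGL
or OpenCL calls at all; Qt's paint engine chooses the rendering path on its
own:

#include <QApplication>
#include <QLabel>
#include <QPixmap>

int main(int argc, char **argv)
{
    QApplication app(argc, argv);

    // Plain widget code: no OpenGL or OpenCL calls anywhere.
    // Qt's paint engine decides how the pixels reach the screen.
    QLabel viewer;
    viewer.setPixmap(QPixmap("photo.jpg")); // placeholder file name
    viewer.setWindowTitle("Image viewer sketch");
    viewer.show();

    return app.exec();
}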
Best
Gilles Caulier
On Tue, Dec 25, 2018 at 22:54, Hans-Peter <hans-ph at web.de> wrote:
> Hi,
>
> I currently work with on-board graphics (Intel HD 530, Core i5), which works
> fine for most things. However, can digiKam take advantage of modern
> graphics cards? For example, when decoding raw images or for the local
> contrast tool.
> If so, would e.g. an NVIDIA GTX 1050 Ti make a noticeable difference?
>
> Maybe a FAQ :)
>
> HP
>
> --
> --------- 8< -------------
> Why taunt me? Why upbraid me? I am merely a genius, not a god.
> (Nero Wolfe)
> My pictures: http://jalbum.net/a/1456383
> Mountains: http://jalbum.net/de/browse/user/album/1823943
>
>
>