<div dir="ltr">Hi, <div><br></div><div>Typically no. RAw decoding will take advantage of CPU cores, to run demosaicing in parallel. GPU has nothing to do here. libraw used in background do not use CPL language here, as i know. If something must be improved here it's in libraw core, not digiKam.</div><div><br></div><div>But to be hosnest, using this kind of extension will decrease portability and make the code less easy to maintain.</div><div><br></div><div>We are now based on Qt5 and GUI widgets take advantage automatically of GPU computation to render image quickly on screen. There is no need to code in Advanced language here, as all is done in Qt5 core. </div><div><br></div><div>Best</div><div><br></div><div>Gilles Caulier</div></div><br><div class="gmail_quote"><div dir="ltr">Le mar. 25 déc. 2018 à 22:54, Hans-Peter <<a href="mailto:hans-ph@web.de">hans-ph@web.de</a>> a écrit :<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Hi,<br>
>
> I currently work with on-board graphics (Intel HD 530, Core i5), which works
> fine for most things. However, can digiKam take advantage of modern
> graphics cards? For example, when decoding RAW images or for the local
> contrast tool.
> If so, would e.g. an NVIDIA GTX 1050 Ti make a noticeable difference?
>
> Maybe a FAQ :)
>
> HP
>
> --
> --------- 8< -------------
> Why taunt me? Why upbraid me? I am merely a genius, not a god.
> (Nero Wolfe)
> My pictures: http://jalbum.net/a/1456383
> Mountains: http://jalbum.net/de/browse/user/album/1823943
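P.S. Here is a minimal sketch of what "libraw in the background" means, so you can see where the work happens. This is not digiKam's actual code; the file name and the quality value are only placeholder examples, and the comments assume a libraw build compiled with OpenMP support:

    #include <libraw/libraw.h>
    #include <cstdio>

    int main()
    {
        LibRaw raw;

        // Open and unpack the RAW file (path is just an example).
        if (raw.open_file("photo.nef") != LIBRAW_SUCCESS) return 1;
        if (raw.unpack() != LIBRAW_SUCCESS) return 1;

        // Pick a demosaicing algorithm (same meaning as dcraw -q, 3 = AHD).
        raw.imgdata.params.user_qual = 3;

        // Demosaicing and post-processing run here, entirely on the CPU.
        // When libraw is compiled with OpenMP, this step is spread over
        // the available cores automatically; the GPU is never touched.
        if (raw.dcraw_process() != LIBRAW_SUCCESS) return 1;

        int err = 0;
        libraw_processed_image_t* img = raw.dcraw_make_mem_image(&err);
        if (img)
        {
            std::printf("%d x %d, %d bits per sample\n",
                        img->width, img->height, img->bits);
            LibRaw::dcraw_clear_mem(img);
        }

        raw.recycle();
        return 0;
    }

So more (and faster) CPU cores speed up RAW decoding; a GTX 1050 Ti does not.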