[digiKam-users] Best practice - large database

Gilles Caulier caulier.gilles at gmail.com
Tue Jan 5 17:30:58 GMT 2021


Hi and happy new year.

On Tue, Jan 5, 2021 at 18:05, Stefan Wulff <stefan.wulff at gmx.de> wrote:
>
> Hi!
>
> I’ve been playing around with digiKam for about three months and am getting more and more familiar with it. (I had been using ThumbsPlus on Windows + MariaDB on a NAS for many years...)
>
> Currently I’m using:
>
> digiKam on 2 MacBooks
> MariaDB on a Raspberry Pi 4 running from a USB3 SSD
> about 35k pictures on a Synology NAS
> one database for everything: core, thumbs, faces
> database size is about 5 GB.
>
> Performance is ok ... but not really fast.
>
> The Raspberry Pi and the most-used MacBook are connected via Gbit Ethernet.
>
> My first trial was MariaDB on the Synology NAS, but it was much too slow because nearly every database access had to hit the HDD.
>
> So my questions are:
>
> Should I divide into separate databases for core, thumbs, faces?

With MariaDB and MySQL, the database files are already stored separately.

> Is there an option to use smaller thumbnails/preview to reduce db size?

Yes: in the Setup dialog, you can turn HiDPI thumbnails on or off to reduce their size.

> Other suggestions?

Always use an SSD to host at least the database (hosting the collection
on SSD is even better), and never use Wi-Fi.

The most important point is to localize the bottleneck: computer,
network, or software. First check the network connection, then the
hardware (hard drive). On the software side, outside digiKam, the
method used to share files over the network matters. After many years
of experimenting with NFS in my office, I switched to Samba, which
performs better and is well supported everywhere. I don't know the
Apple network file system...
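To localize the bottleneck as described above, it helps to benchmark each layer in isolation. Here is a minimal sketch; the target path is an assumption (point it at the database SSD or the NAS mount you want to measure), and `conv=fdatasync` is GNU dd, so on macOS drop that option and use `bs=1m`:

```shell
# Rough sequential disk-throughput check.
TARGET=/tmp/digikam-io-test

# Write 100 MiB; conv=fdatasync forces the data to disk before dd
# reports a rate, so the number is not just the page cache.
dd if=/dev/zero of="$TARGET" bs=1M count=100 conv=fdatasync 2>&1 | tail -n 1

# Read the same file back.
dd if="$TARGET" of=/dev/null bs=1M 2>&1 | tail -n 1

rm -f "$TARGET"

# For the network leg, run "iperf3 -s" on the Pi and
# "iperf3 -c <pi-address>" from the MacBook; a healthy Gbit
# Ethernet link should report somewhere near 940 Mbit/s.
```

Comparing the raw disk rate, the raw network rate, and what digiKam actually achieves usually makes it obvious which layer is the slow one.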

Raspberry Pi hardware is very good for file sharing. I use it at home
for multimedia purposes and it rocks with large files.
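For reference, a Samba export of the collection from the Pi can be as small as one share definition in /etc/samba/smb.conf. This is just a sketch; the share name, path, and user are placeholders to adapt to your setup:

```ini
# /etc/samba/smb.conf -- minimal share for the photo collection
[photos]
   path = /mnt/ssd/photos
   read only = no
   valid users = stefan
```

After editing, restart the smbd service and mount the share from the Mac over Gbit Ethernet rather than Wi-Fi.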

Best

Gilles Caulier

