[Digikam-users] How to move a digikam collection
Jean-François Rabasse
jean-francois.rabasse at wanadoo.fr
Sat Jul 28 16:27:39 BST 2012
Hello Thomas,
On Sat, 28 Jul 2012, Thomas wrote:
> My DB is on my local computer in ~/Pictures/digikam4.db
>
> I have now tried from a local file system instead of NFS/NAS.
>
> I still get this error from digikam: digikam(28701)/digikam (core)
> Digikam::DatabaseCoreBackendPrivate::checkRetrySQLiteLockError:
> Detected locked database file. There is an active transaction.
> Waited but giving up now.
As for me, I've never seen that kind of message. Usually database
locking may occur after a system crash, or when a DB file is backed up
while an application still has a pending transaction.
As far as I know, sqlite3 uses standard file locks to implement
access control (the flock() system call), and flock() doesn't work very
well with NFS-mounted volumes. (This is under Linux; it works better
under Solaris, but very few users run Solaris x86.)
Maybe there's a way to drop locks via some sqlite3 commands
on the file. Perhaps you should have a look at
http://www.sqlite.org/lockingv3.html
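Something simple to check first (only my guess about the usual causes):
make sure no process still holds the file, and look whether an
interrupted transaction left a journal file behind:
fuser -v ~/Pictures/digikam4.db
ls -l ~/Pictures/digikam4.db-journal
If a digikam4.db-journal file is lying there while nothing is running,
sqlite3 normally replays or discards it the next time the DB is opened,
e.g. with a harmless read:
sqlite3 ~/Pictures/digikam4.db 'SELECT count(*) FROM sqlite_master;'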
Something else that may work (at least it's worth trying because it's
really simple) is to dump your database:
sqlite3 digikam4.db '.dump' > db.dump
move your DB file aside (keep it as a backup):
mv digikam4.db digikam4.db.bak
and rebuild a brand new DB from the dump:
sqlite3 digikam4.db < db.dump
Internal DB state, locks et al., probably isn't exported by an SQL
dump. It shouldn't be, but that has to be checked.
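After the rebuild, a quick sanity check on the new file can't hurt
(PRAGMA integrity_check is a standard sqlite3 command, nothing
digiKam-specific):
sqlite3 digikam4.db 'PRAGMA integrity_check;'
It should answer « ok » if the dump/reload went through cleanly.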
> Please, is there any way I can just recover all my digikam tags from
> the DB, so that I can import them into a new digikam DB?
>
> This stuff needs to be robust. It takes huge amounts of work to tag
> so many photos. We cannot let users lose important data like this.
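To answer the recovery question first: even if digiKam refuses to open
the DB, the tag assignments can usually be read back with a plain
SELECT, as long as the file itself is still readable. Something like
this should do it (the table and column names are from my memory of
the digikam4 schema, double-check them with the '.schema' command):
sqlite3 digikam4.db "SELECT a.relativePath, i.name, t.name
  FROM ImageTags it
  JOIN Images i ON i.id = it.imageid
  JOIN Tags t ON t.id = it.tagid
  JOIN Albums a ON a.id = i.album;" > tags.txt
That gives a flat « folder, file, tag » listing you can keep aside and
reinject later, by hand or by script.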
I fully agree that tagging, documenting, and geolocating images is a
lot of work. Moreover, I think this work is expected to have a VERY
long lifetime. (As for myself, I consider that lifetime should be
« forever », same as the images themselves.)
Given that, my *personal opinion* is that a database is of no use for
that. Database schemas evolve with application versions and DB engine
versions, and there's no chance that your SQLite3 Digikam DB of today
will still be valid 20 years from now. And what about the day you
change your image management software!
For me, databases are only « work data », there to let an application
do fast search and filtering via SQL select queries. A database is not
an archival system at all, and it should be expected to be rebuilt
from scratch, provided the information is saved somewhere in a more
robust format.
And (again my personal opinion) that's why I write metadata into the
image files themselves, EXIF/GPS subIFD for geolocation info and XMP
for other metadata. And I want to believe I'll still be able to
read/extract that even in 10 or 20 years. I don't trust XMP sidecar
files, which are not standardized at all between applications, even
today.
My bet is that at any time, any metadata-reading program will allow
extracting the information and reinjecting it into some other system,
be it in 2015, or 2025, or 2035 ...
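As a small illustration, a tool like exiftool can already pull that
information back out today (the exact tag names depend on what the
writing application used, so this is only an example):
exiftool -Subject -HierarchicalSubject -GPSLatitude -GPSLongitude somephoto.jpg
And whatever equivalent tool exists in 2035 should be able to do the
same, because the information lives inside the image file itself.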
But this is only a personal point of view; it's probably an
interesting discussion topic: what reliable solutions answer the good
initial question « What about my metadata in 20 years ? » :-)
Regards,
Jean-François