[Kde-pim] Problem with bulk fetching of items with 4.8

Kevin Krammer krammer at kde.org
Mon Apr 23 22:19:47 BST 2012


God, switching the status filter to "todo" is disheartening.
How could I have not replied to this in almost a month?


On Monday, 2012-04-02, Shaheed Haque wrote:
> Can anybody summarise, or point to a summary of, the parallelism introduced
> by Akonadi Jobs versus
> 
>     QMetaObject::invokeMethod(this, "myHandler", Qt::QueuedConnection);
> 
> I'm trying to work out what locking if any I need between these two types
> of "parallel" activity. I'm assuming that Akonadi jobs are effectively
> serialised, so no locking is needed between say, a retrieveItem() and a
> scheduleCustomTask()?

Neither of the two methods introduces parallelism in the sense of concurrency, 
i.e. all access to data happens from the same thread.

Both methods you mention are basically two uses of the same mechanism: 
delaying a call to a method by sending yourself an event that triggers the 
call once it is processed by Qt's event system.

In other words, the execution of an Akonadi job is triggered by the very same 
invokeMethod facility.

It is a bit like a reminder, i.e. your code schedules a hint about what it 
should do later, when it is more appropriate.
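
To make that concrete, here is a minimal sketch (class and method names are 
made up, and it assumes a single main.cpp in a moc-enabled Qt build): the 
invokeMethod() call returns immediately and myHandler() only runs once control 
is back in the event loop.

#include <QCoreApplication>
#include <QDebug>
#include <QObject>

class Example : public QObject
{
    Q_OBJECT
public:
    void scheduleWork()
    {
        // Posts an event to this object's thread and returns immediately;
        // myHandler() only runs once the event loop processes that event.
        QMetaObject::invokeMethod(this, "myHandler", Qt::QueuedConnection);
        qDebug() << "scheduleWork() returns before myHandler() runs";
    }

public Q_SLOTS:
    void myHandler()
    {
        qDebug() << "myHandler() runs from the event loop";
        QCoreApplication::quit();
    }
};

int main(int argc, char **argv)
{
    QCoreApplication app(argc, argv);
    Example example;
    example.scheduleWork();
    return app.exec(); // the queued invocation is dispatched here
}

#include "main.moc" // assuming this file is main.cpp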

The trigger events themselves are serialized, i.e. if you call invokeMethod 
(or create two Akonadi jobs) twice in a row, the first one will be invoked 
first.
However, if the currently executing one creates another delayed invocation or 
advances event processing in some other way (e.g. by calling 
QCoreApplication::processEvents), then the event triggering the second will be 
processed before anything the current job or method schedules for delayed 
execution.
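
A small sketch of that caveat (slot names are again made up, assuming a single 
main.cpp in a moc-enabled Qt build): first() and second() are queued in that 
order, but the processEvents() call inside first() dispatches second() right 
there, before third(), which first() schedules afterwards.

#include <QCoreApplication>
#include <QDebug>
#include <QObject>

class Ordering : public QObject
{
    Q_OBJECT
public Q_SLOTS:
    void first()
    {
        qDebug() << "first";
        // Re-entering the event loop dispatches the already queued second()...
        QCoreApplication::processEvents();
        // ...so third(), queued only now, runs after second().
        QMetaObject::invokeMethod(this, "third", Qt::QueuedConnection);
    }
    void second() { qDebug() << "second"; }
    void third()  { qDebug() << "third"; QCoreApplication::quit(); }
};

int main(int argc, char **argv)
{
    QCoreApplication app(argc, argv);
    Ordering ordering;
    QMetaObject::invokeMethod(&ordering, "first",  Qt::QueuedConnection);
    QMetaObject::invokeMethod(&ordering, "second", Qt::QueuedConnection);
    return app.exec(); // prints: first, second, third
}

#include "main.moc" // assuming this file is main.cpp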
 
scheduleCustomTask() is conceptually similar in that it does not create any 
locking issues (i.e. the task is also executed by the main thread), but it 
differs in where it is queued.
Task processing only advances when a task is either completed or cancelled, 
and no amount of event processing will move the task queue beyond the 
current task.
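
For completeness, a rough sketch of the custom task side in a ResourceBase 
subclass (from memory, so check resourcebase.h for the exact 
scheduleCustomTask() signature and the include path; the class and slot names 
are placeholders):

#include <akonadi/resourcebase.h>
#include <QDebug>
#include <QVariant>

class MyResource : public Akonadi::ResourceBase
{
    Q_OBJECT
public:
    explicit MyResource(const QString &id) : Akonadi::ResourceBase(id) {}

    void queueWork()
    {
        // Goes into the resource's task scheduler, not the Qt event queue.
        scheduleCustomTask(this, "doCustomWork",
                           QVariant::fromValue(QString("some argument")));
    }

public Q_SLOTS:
    void doCustomWork(const QVariant &argument)
    {
        // Still runs in the main thread, like retrieveItem() etc.
        qDebug() << "custom task running with" << argument;
        // The scheduler does not start the next task until we report back:
        taskDone(); // or cancelTask("something went wrong");
    }
};

The important part is that nothing else from the task queue (retrieveItems(), 
ItemSync, other custom tasks) runs until taskDone() or cancelTask() has been 
called, no matter how often the event loop spins in between.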

Cheers,
Kevin

> 
> 2012/2/10 Kevin Krammer <kevin.krammer at gmx.at>
> 
> > Hi Shaheed,
> > 
> > On Saturday, 2012-02-04, Shaheed Haque wrote:
> > > Hi Kevin,
> > > 
> > > See below...
> > > 
> > > 2012/2/4 Kevin Krammer <kevin.krammer at gmx.at>:
> > > > Hi Shaheed,
> > > > 
> > > > I am not entirely sure what problem you are trying to solve.
> > > > 
> > > > My interpretation is that Exchange does not allow you to query for
> > 
> > items
> > 
> > > > that have been created/modified/deleted since the last sync.
> > > 
> > > Correct.
> > > 
> > > > Since that would mean there is no such thing as the concept "remote
> > > > revision", why would you want to store one?
> > > > 
> > > > Without the option of just getting the updates you will have two sets
> > > > of items, one from Akonadi and one from Exchange.
> > > > 
> > > > Any item in E but not in A is new.
> > > > Any item in E and in A is potentially modified.
> > > > Any item not in E but in A has been deleted.
> > > 
> > > Also correct.
> > > 
> > > > But as I said, I think I don't understand the problem.
> > > 
> > > The piece you are missing is the amount of data, and the speed with
> > > which it can be fetched. On my system, I can fetch about 500 items
> > > every 50 seconds, and there are about 500k items to fetch, so a full
> > > download takes ~50k seconds or about 14 hours. Both the number of
> > > items, and the download time mean that I cannot realistically do the
> > > usual thing of building two lists for E and A, and subtracting one
> > > from the other.
> > 
> > I see, that indeed changes things.
> > 
> > > Instead, I have this design in mind...
> > > 
> > > 1. When I start fetching the collection, I will note the starting time
> > > using a collection attribute to persist the information (in case of
> > > needing to restart the machine).
> > 
> > You could alternatively use Collection::remoteRevision
> > 
> > > 2. I have an incremental fetch phase during which I fetch data in
> > > batches (of 500 items). After each batch, I "bookmark" where I got to.
> > > If I shutdown my machine, on restart, I resume the fetch using the
> > > bookmark.
> > > 
> > > 3. When I get to the end (I've never actually managed to get to that
> > > point yet!), I hope to delete all the items with a creation date prior
> > > to the recorded start time.
> > > 
> > > I hope that makes sense? Anyway, it is the query for this last part
> > > that I am stuck on - or some other better idea!
> > 
> > The only alternative I can come up with is updating each processed
> > Akonadi item's remote revision so that all non-updated ones remain at
> > the old one.
> > 
> > I.e.
> > 
> > foreach( item from exchange ) {
> >     fetch akonadi item
> >     depending on result either create or modify
> >     use new timestamp as item remote revision
> > }
> > 
> > Once the full sync is done, fetch all Akonadi items again and delete all
> > which still have the original timestamp.
> > 
> > Cheers,
> > Kevin
> > 
> > --
> > Kevin Krammer, KDE developer, xdg-utils developer
> > KDE user support, developer mentoring
> > 
> > _______________________________________________
> > KDE PIM mailing list kde-pim at kde.org
> > https://mail.kde.org/mailman/listinfo/kde-pim
> > KDE PIM home page at http://pim.kde.org/
> 
> _______________________________________________
> KDE PIM mailing list kde-pim at kde.org
> https://mail.kde.org/mailman/listinfo/kde-pim
> KDE PIM home page at http://pim.kde.org/

-- 
Kevin Krammer, KDE developer, xdg-utils developer
KDE user support, developer mentoring

