DataEngines and models

Marco Martin notmart at gmail.com
Tue Dec 24 16:34:46 UTC 2013


Hi all,
I decided to take a crack at an old pet peeve of mine (and quite a sore point, 
if we look around in dataengines): integration of dataengines with normal 
QAbstractItemModels.
Now, one could ask why not do everything as imports of either QObjects with an 
API or models. Fair enough, but there are some reasons:

* we have already invested a lot in a large number of dataengines; their 
design could be improved, but in most cases only because they lacked a way to 
expose multi-row models
* the design of dataengines is still better for anything whose data is 
represented by a single item (system monitor, date, battery...). Why? Because 
of their completely async nature; with a plain QObject it's too easy to export 
slow, synchronous methods
* data requested by multiple consumers is shared in the same DataEngine and 
DataContainer instances
* services are neat for things async in nature, like authentication on a 
website, where blocking calls are a big no (and reimplementing a similar job 
or future API every time in every import is tedious)

So, in many cases I think that having imported models is just fine, but in 
other cases it would be nice to have dataengines and normal 
QAbstractItemModels merged into one.
So:

plasma-framework branch mart/modelsInDataEngine

It adds a property in DataContainer, setModel()/model(), that is just a 
QAbstractItemModel *, so the implementation is completely up to the dataengine 
developer and is not limited in any way.
As in the example plasma-framework/examples/dataengines/sourcesOnRequest, when 
the user connects to a source, a model can be set on the DataContainer in the 
sourceRequestEvent of the dataengine implementation.

At that point, if the source is connected with a 0 interval, it's up to the 
model to update itself on demand; this is the typical use case for local data.

The same example also shows something that can be useful for remote data, 
when you need polling: in updateSourceEvent you can still access the model, 
so if you are doing something like downloading RSS feeds, Twitter updates or 
whatever, you can just call containerForSource(source)->model()->update() 
there, or whatever way you tell your model to refresh. The normal data of the 
DataContainer is still available, so one can put some global data in it, like 
the number of unread emails, the total RSS items and so forth.
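To make the two callbacks concrete, here is a minimal sketch of what such an 
engine could look like. Take it as illustration only: the engine class and 
source key names are invented, only setModel()/model() on DataContainer come 
from the branch itself, and the rest is stock Plasma::DataEngine API, so 
details may differ from the actual branch:

```cpp
// Hypothetical engine against the mart/modelsInDataEngine branch.
// "FeedEngine" and the "unreadCount" key are invented for the example.
#include <Plasma/DataEngine>
#include <Plasma/DataContainer>
#include <QStandardItemModel>

class FeedEngine : public Plasma::DataEngine
{
    Q_OBJECT
public:
    using Plasma::DataEngine::DataEngine;

protected:
    // Called the first time a consumer connects to a source: create the
    // source and hang a model off its DataContainer.
    bool sourceRequestEvent(const QString &source) override
    {
        setData(source, QStringLiteral("unreadCount"), 0);
        if (Plasma::DataContainer *container = containerForSource(source)) {
            // One model per source; its implementation is entirely up to
            // the engine developer.
            container->setModel(new QStandardItemModel(container));
        }
        return true;
    }

    // Called on the polling interval (source connected with interval > 0):
    // refresh the model and the plain key/value data alongside it.
    bool updateSourceEvent(const QString &source) override
    {
        if (Plasma::DataContainer *container = containerForSource(source)) {
            auto *model =
                qobject_cast<QStandardItemModel *>(container->model());
            if (model) {
                // e.g. re-download the feed and repopulate the rows here
                model->appendRow(new QStandardItem(QStringLiteral("item")));
                setData(source, QStringLiteral("unreadCount"),
                        model->rowCount());
            }
        }
        return true;
    }
};
```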

For all the "write" actions, services can still be used in conjunction, so 
the service for the same source as the model could do things like 
authenticating.

Another notable thing is that there will be a single model instance 
regardless of how many plasmoids are using it: every DataContainer is 
refcounted, so it will be deleted only when the last plasmoid using it 
disconnects.

On the QML bindings side there is a "models" property indexed by source, just 
like "data"; an example can be found in plasma-
framework/examples/applets/dataenginemodel/
It uses a plain QStandardItemModel, which will probably be enough for a 
number of our dataengines.
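On the consumer side, a plasmoid could then feed the model straight into a 
view. A rough sketch, with invented engine/source names and roles, assuming 
the "models" property ends up exposed on DataSource the same way "data" is:

```qml
import QtQuick 2.0
import org.kde.plasma.core 2.0 as PlasmaCore

Item {
    PlasmaCore.DataSource {
        id: feedSource
        engine: "myfeedengine"        // invented engine name
        connectedSources: ["myfeed"]  // invented source name
        interval: 30000               // poll remote data every 30s
    }

    ListView {
        anchors.fill: parent
        // "models" is indexed by source, just like "data"
        model: feedSource.models["myfeed"]
        delegate: Text { text: display } // standard Qt::DisplayRole
    }

    // the plain key/value data is still there alongside the model
    Text { text: "Unread: " + feedSource.data["myfeed"]["unreadCount"] }
}
```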

Opinions? Comments?

If this can be merged, I would like to start tasks to port most of our 
engines whose data behaves like a "model" (like RSS or notifications) to this 
"actual model" approach, and leave as plain DataEngine::Data only the stuff 
that is supposed to be a single, simple map (most dataengines, luckily).

After that is done, even if it is pretty invasive, it would be good to remove 
the DataModel class in the PlasmaCore import (or at least dramatically 
simplify it to allow only the "sources as items" code path).

Cheers,
Marco Martin


More information about the Plasma-devel mailing list