Review Request: New KIO::http_post and KIO::StoredHttpPost APIs that accept a QIODevice as input...

Allan Sandfeld Jensen kde at carewolf.com
Fri Feb 4 16:13:15 GMT 2011


On Friday 04 February 2011, David Faure wrote:
> I like QIODevice actually, for reading stuff on demand.
> I use it everywhere in KArchive (KZip, KTar...) and KFilterDev.
> QIODevice is a "pull" solution - you can ask for "1MB of data now".
> Well, at least with buffers and files, not necessarily with sockets :-)
> The good thing with QIODevice is that it works on top of both memory
> (QBuffer) and files (QFile), so in kde5 we could even remove the
> QByteArray overload. And if you want to send bzip2 compressed data you
> could even insert a KFilterDev to do it "on the fly".
> 
Yes. It is nice because it is synchronous, where KIO is asynchronous. The 
problem where it falls apart is when you use multiple threads. Since Qt IO 
works by registering file handles in the giant poll/select at the core of the 
Qt event loop, you have to make all requests to a QIODevice from the thread 
that owns it, or they will block indefinitely (this has bitten me a million 
times!). I cannot remember whether it has been fixed, but up to at least Qt 
4.5 you also could not move a QIODevice between threads by changing its 
ownership, as you can with other QObjects. This meant you always had to 
access it from the thread it was created in. 

As I said, I have nothing against QIODevice; I use it all the time. I was 
just thinking async-to-async inside KIO would be nicer, so the programmer 
doesn't have to learn the brutal quirks of both systems.

> KIO::Job on the other hand is a push solution - it will tell you when it has 
> something. Well in this case both would work I guess, slotDataReqFromDevice 
> could just send whatever is available, using an intermediate storage for
> what the underlying get job is emitting.
> But the only benefit would be "being able to upload remote data", not sure
> if we have a use case for that.
> 
Okay, maybe the immediate use case is a bit far-fetched, but in my mind any 
sufficiently large file should be treated as a stream, because you are only 
going to be able to process it one chunk at a time.

Allan