[gst-devel] Re: Comparison: MAS, GStreamer, NMM

Christian Fredrik Kalager Schaller uraeus at linuxrising.org
Wed Aug 25 17:04:54 BST 2004


On Wed, 2004-08-25 at 16:40, Matthias Kretz wrote:
> On Wednesday 25 August 2004 15:58, Thomas Vander Stichele wrote:
> > Hi Marco,
> >
> > > (1) Allow the playback of an encoded audio file (e.g. MP3). This will
> > > result in similar setups: a component for reading data from a
> > > file connected to a component for decoding connected to a component
> > > for audio output. (Together, this is called "pipeline" or "flow
> > > graph").
> > > (2) Set the filename of the file to be read.
> > > (3) Manually request/setup this functionality, i.e. no automatic setup
> > > of flow graphs.
> > > (4) Include some error handling.
> >
> > I'm curious about (3) - why should it not be done automatically ?
> 
> Both should be possible. In case you don't want it to be done behind your 
> back, when you want to have a special setup...
> 
> > So 
> > you're saying you just want an application that can only play mp3's ?
> > Personally I think a much better test would be to have a helloworld that
> > can take a media file and just play it, whatever type it is.  That's
> > what users care about, anyway.
> 
> This is not about writing an especially useful app, but about looking at how
> the API "feels". While I find it interesting to see how to set up the
> framework to play an mp3 file, it's of course also interesting how much the
> framework can do on its own.
> But... this is not really about comparing features, it's more about comparing 
> APIs.

I would think this is the least interesting thing to look at in this
setting, because if the Qt-style bindings to GStreamer (or Qt-style
bindings someone made for MAS) aren't exactly what you want, you can
change them to fit your needs in the coming year. In his talk at the
conference Matthias Ettrich said that in some ways Qt is basically a
binding to a lot of different C libraries; I remember he mentioned
freetype as one. So you should be able to get a Qt binding that is just
as good and comfortable to program in and with as the rest of Qt. The
important things for KDE to evaluate, IMHO, especially considering that
KDE 4.0 is just a year away, are features, maturity, interoperability,
viability and cross-platform support.
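
(To make that concrete: the kind of manual-setup helloworld asked for
above looks roughly like this with GStreamer's plain C API, much as in
the GStreamer docs. This is a sketch from memory, with most error
handling left out and "mad"/"osssink" just picked as example decoder and
output elements; a Qt-style binding would simply wrap this.)

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *source, *decoder, *sink;

  gst_init (&argc, &argv);

  if (argc < 2) {
    g_printerr ("usage: %s <mp3 file>\n", argv[0]);
    return 1;
  }

  /* (1) build the flow graph by hand: file reader -> decoder -> output */
  pipeline = gst_pipeline_new ("mp3-player");
  source   = gst_element_factory_make ("filesrc", "file-reader");
  decoder  = gst_element_factory_make ("mad",     "mp3-decoder");
  sink     = gst_element_factory_make ("osssink", "audio-output");
  /* (4) real code would check each of these for NULL and report it */

  /* (2) set the filename of the file to be read */
  g_object_set (G_OBJECT (source), "location", argv[1], NULL);

  /* (3) manual setup, no autoplugging */
  gst_bin_add_many (GST_BIN (pipeline), source, decoder, sink, NULL);
  gst_element_link_many (source, decoder, sink, NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* iterate until the stream is done */
  while (gst_bin_iterate (GST_BIN (pipeline)))
    ;

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (pipeline));
  return 0;
}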



> > > In a second step, we would like to extend the helloworld program with
> > > the following feature (helloworld II):
> > >
> > > (1) Add a listener that gets notified if the currently playing file
> > > has ended, i.e. this listener is to be triggered after the last byte
> > > was played by the audio device.
> >
> > What sort of thing is your listener ? An in-program function callback ?
> > Another process ? Something else ?
> 
> Any kind of listener. If in C you do it with a callback, just show the code
> for how to do it.
> In Qt/KDE you'd just want to connect a slot to a signal.
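
(For what it's worth, with GStreamer in plain C that listener is just a
GObject signal connection - if memory serves the pipeline emits "eos"
once the last buffer has gone through the sink - so a Qt-style binding
could map it straight onto a Qt signal/slot. Rough sketch only, again
with "mad"/"osssink" as example elements:)

#include <gst/gst.h>

/* called when the pipeline reports end-of-stream */
static void
eos_cb (GstElement *pipeline, gpointer user_data)
{
  g_print ("playback finished\n");
}

int
main (int argc, char *argv[])
{
  GstElement *pipeline;
  GError *error = NULL;

  gst_init (&argc, &argv);

  pipeline = gst_parse_launch (
      "filesrc location=song.mp3 ! mad ! osssink", &error);
  if (pipeline == NULL) {
    g_printerr ("could not build pipeline: %s\n", error->message);
    return 1;
  }

  /* the listener: just a signal connection */
  g_signal_connect (G_OBJECT (pipeline), "eos", G_CALLBACK (eos_cb), NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  while (gst_bin_iterate (GST_BIN (pipeline)))
    ;                 /* loop ends (and eos_cb fires) at end of stream */

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (pipeline));
  return 0;
}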
> 
> > > In a final step, we would like to extend the helloworld program
> > > (helloworld I) to allow for distributed playback (helloworld III):
> > >
> > > (1) The component for reading data from a file should be located on the
> > > local host. The component for decoding, and playing the audio data should
> > > be located on remote host.
> > >
> > > Notice that this third example should also demonstrate how easy (or
> > > painful) it is to develop networked multimedia applications using the
> > > particular framework. We hope that this will finally show that
> > > developing distributed multimedia applications means more than "well,
> > > simply write a component for streaming data and put that into your
> > > pipeline".
> >
> > Not sure why the third is important.  While it's important for
> > multimedia frameworks to be able to do things like this, I don't see the
> > value of this for a desktop environment.  Can you provide a use case
> > where this makes sense ?
> 
> Thin clients, conferencing, cool stuff like shown in the NMM talk at 
> aKademy ;-)
Yes, but I am not so sure that having the media framework on both sides
is the right solution in most of these cases. For the thin-client
scenario, for example, I think the better solution is a lightweight,
limited-functionality media client, with the media framework on the
other end doing whatever transcoding and adaptation is needed for that
light client to be able to play/display what you are sending it.
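
(As a rough illustration of what I mean, and glossing over the format
signalling you would obviously need in practice: the server keeps the
whole framework and just hands the client something it can play back
blindly. A sketch only - element names from memory, the port number made
up - and the thin client then only has to read the socket and write to
its sound device.)

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GError *error = NULL;
  GstElement *server;

  gst_init (&argc, &argv);

  /* all decoding/adaption happens here on the server; what goes over
   * the wire is raw audio the thin client can write straight out
   * (real code would also negotiate sample rate/format with the client) */
  server = gst_parse_launch (
      "filesrc location=song.mp3 ! mad ! tcpserversink port=4953", &error);
  if (server == NULL) {
    g_printerr ("could not build pipeline: %s\n", error->message);
    return 1;
  }

  gst_element_set_state (server, GST_STATE_PLAYING);
  while (gst_bin_iterate (GST_BIN (server)))
    ;

  gst_element_set_state (server, GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (server));
  return 0;
}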

My way of solving the exact demo done at the conference would be to have
a daemon running which offers a rendezvous/zeroconf service that it
broadcasts over bluetooth/wlan. When you then arrive with your
bluetooth/wlan-equipped handheld, it would inform the owner that X number
of services are available and ask whether she/he wants to transfer the
stream to any of them. You don't really want to keep playing from your
PDA once you get home anyway, so adding some functionality that transfers
playback to the sound collection on your central server/machine would
probably be the final piece of the solution.
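
(And just to make the daemon part a bit more concrete, purely as a
sketch using the dns_sd/Rendezvous API: the daemon basically just
registers a service and waits for handhelds to discover it. The
"_mediasink._tcp" type and the port are made up for the example.)

#include <dns_sd.h>
#include <arpa/inet.h>
#include <stdio.h>

/* called once the service name has been registered on the network */
static void
register_cb (DNSServiceRef ref, DNSServiceFlags flags,
             DNSServiceErrorType error, const char *name,
             const char *type, const char *domain, void *ctx)
{
  if (error == kDNSServiceErr_NoError)
    printf ("advertising %s.%s%s\n", name, type, domain);
}

int
main (void)
{
  DNSServiceRef ref;

  if (DNSServiceRegister (&ref, 0, 0,
                          "Living room stereo",   /* instance name        */
                          "_mediasink._tcp",      /* made-up service type */
                          NULL, NULL,
                          htons (4953),           /* where we listen      */
                          0, NULL,
                          register_cb, NULL) != kDNSServiceErr_NoError) {
    fprintf (stderr, "could not register service\n");
    return 1;
  }

  for (;;)
    DNSServiceProcessResult (ref);   /* dispatch mDNS replies */

  /* not reached; DNSServiceRefDeallocate (ref) on shutdown */
  return 0;
}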

> > Also, I don't think it's smart to do this for audio only.  We all know
> > audio is the easiest to get right anyway, and audio presents a lot less
> > challenge to frameworks.
> 
> Uh, I don't agree that audio is the easier part. Video is easier. 
> Synchronization isn't easy, though.
> 
> > I'm sure I could come up with other things that are important to be
> > tested, I'll think about it some more.
> 
> Again, it's not about a testsuite or features but about the API.
And as I mentioned earlier, that is probably the least interesting thing
to look at, as it is the easiest thing for you to fix at this point in
time.

Christian


