Qt Multimedia status on Android

Scott Aron Bloom Scott.Bloom at onshorecs.com
Thu May 3 17:56:47 UTC 2012


Have you looked at RockPlayer?

It "definitely" uses ffmpeg... OK, no real proof, except that it can
play videos the built-in hardware decoding can't, and it sucks down
battery when it's in software mode.

Scott

-----Original Message-----
From: android-qt at googlegroups.com [mailto:android-qt at googlegroups.com]
On Behalf Of BogDan Vatra
Sent: Thursday, May 03, 2012 9:07 AM
To: android-qt; Necessitas
Subject: Qt Multimedia status on Android

Hello everyone,

 For a couple of weeks a good friend and I tried to provide basic
multimedia support for Qt on Android, but sadly we didn't succeed, so
I'd like to share our findings with you.

  Our first track (and the most important one) was getting ffmpeg or
GStreamer compiled on Android and using it to encode/decode video. I
was misled by MoboPlayer [1], which claims to use ffmpeg for the job,
so everything seemed too good to be true.
After a long period of hard work, we managed to get ffmpeg compiled
for all Android-supported processors (armv5, armv7 and armv7 with
NEON) [2] and to create a simple video player on top of ffmpeg. Then
we discovered that ARM processors are not fast enough to decode H.264
frames: my HTC Desire HD phone (with a 1 GHz processor) could not
decode a 720p frame in less than 90 ms using armv7+NEON, 150 ms using
armv7 (without NEON), or 300 ms using armv5, which is not enough for
realtime decoding. Even worse, after decoding is finished the image
still needs to be converted from YUV420P to RGB, and this operation
takes ages: on my phone it takes from 300 ms to 600 ms per frame!
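
For reference, here is a minimal sketch of how such timings can be
measured with the ffmpeg API of this period (error handling mostly
omitted; newer ffmpeg releases rename several of these functions):

    // Decode every video frame of a file and time the two slow
    // steps: avcodec_decode_video2() and the YUV420P -> RGB
    // conversion done by libswscale.
    extern "C" {
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    #include <libswscale/swscale.h>
    }
    #include <sys/time.h>
    #include <stdio.h>

    static int64_t nowUs()
    {
        timeval tv;
        gettimeofday(&tv, 0);
        return tv.tv_sec * 1000000LL + tv.tv_usec;
    }

    void timeDecode(const char *path)
    {
        av_register_all();
        AVFormatContext *fmt = 0;
        if (avformat_open_input(&fmt, path, 0, 0) < 0)
            return;
        avformat_find_stream_info(fmt, 0);
        int vid = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, 0, 0);
        AVCodecContext *ctx = fmt->streams[vid]->codec;
        avcodec_open2(ctx, avcodec_find_decoder(ctx->codec_id), 0);

        AVFrame *frame = avcodec_alloc_frame();
        SwsContext *sws = sws_getContext(ctx->width, ctx->height, ctx->pix_fmt,
                                         ctx->width, ctx->height, PIX_FMT_RGB565,
                                         SWS_FAST_BILINEAR, 0, 0, 0);
        uint8_t *rgb = (uint8_t *)av_malloc(ctx->width * ctx->height * 2);
        uint8_t *dst[4] = { rgb, 0, 0, 0 };
        int dstStride[4] = { ctx->width * 2, 0, 0, 0 };

        AVPacket pkt;
        while (av_read_frame(fmt, &pkt) >= 0) {
            if (pkt.stream_index == vid) {
                int got = 0;
                int64_t t0 = nowUs();
                avcodec_decode_video2(ctx, frame, &got, &pkt); // 90-300 ms/frame on ARM
                int64_t t1 = nowUs();
                if (got) {
                    sws_scale(sws, frame->data, frame->linesize, 0,
                              ctx->height, dst, dstStride);    // another 300-600 ms
                    printf("decode %lld us, convert %lld us\n",
                           (long long)(t1 - t0), (long long)(nowUs() - t1));
                }
            }
            av_free_packet(&pkt);
        }
    }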
So, what did we do wrong, and why is MoboPlayer so damn fast? After I
recovered from that shock, I started to investigate why MoboPlayer is
that fast, and I soon discovered that MoboPlayer is cheating big
time! It is not using ffmpeg to decode the video at all! It uses
Android's media player for that job, and it uses ffmpeg ONLY to get
the subtitles and to play formats which are not supported by Android.
This discovery was a disaster for me; I could not believe my eyes, so
I began to search for another player which is open source, so I could
check what it is doing under the hood. I found Dolphin Player [3][4],
and after I installed it my fears came true: the player was not able
to play any of the videos recorded with my phone in realtime! Digging
deeper into the ffmpeg sources, I found that ffmpeg can use Android's
stagefright [5] library to decode H.264 frames. Sadly, this library
cannot be used on all Android devices, and it only decodes to a
YUV420P image, so we would still need to convert it to RGB.
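
As a sketch, selecting that stagefright-backed decoder from ffmpeg
looks roughly like this (the "libstagefright_h264" decoder is only
present when ffmpeg was configured against the Android sources, which
is exactly why it cannot be relied on everywhere):

    extern "C" {
    #include <libavcodec/avcodec.h>
    }

    // Prefer the hardware-backed stagefright decoder when it exists,
    // otherwise fall back to ffmpeg's software H.264 decoder. Either
    // way the output frames are YUV420P and still need conversion.
    AVCodec *pickH264Decoder()
    {
        avcodec_register_all();
        AVCodec *hw = avcodec_find_decoder_by_name("libstagefright_h264");
        if (hw)
            return hw;
        return avcodec_find_decoder(CODEC_ID_H264);
    }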

After I recovered from that heart attack, I began to investigate how
on earth Android does it that fast, and whether I could trick OpenMAX
into rendering into a buffer which I can read, instead of rendering
into an Android surface. After checking almost all the OpenMAX
implementations, I found that it is possible to send a
SurfaceTextureClient [7] object to OpenMAX [6]. Again it seemed too
easy and too good to be true. The same technique could also be used
for the camera preview, even though it needs non-public APIs in order
to work (by the end of this year all major Android vendors will have
upgraded to 4.x)!
Basically, all I needed to do was implement the ISurfaceTexture
interface [8] and read the image. The image should arrive in one of
these PixelFormat [9] formats. Of course nothing is as easy as it
sounds, and again it was a dead end, because Android uses *NONE* of
those pixel formats [10]!
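
To illustrate, here is a sketch of the kind of check that fails: a
buffer handed to an ISurfaceTexture implementation carries a format
field, and classifying it against the public constants from
system/graphics.h [10] shows it falls in the vendor-reserved range
[11] instead:

    #include <system/graphics.h>   // HAL_PIXEL_FORMAT_* constants

    // Classify the format of a gralloc buffer queued to our consumer.
    static const char *classifyFormat(int format)
    {
        switch (format) {
        case HAL_PIXEL_FORMAT_RGBA_8888:
        case HAL_PIXEL_FORMAT_RGBX_8888:
        case HAL_PIXEL_FORMAT_RGB_888:
        case HAL_PIXEL_FORMAT_RGB_565:
        case HAL_PIXEL_FORMAT_BGRA_8888:
            return "public RGB format";
        case HAL_PIXEL_FORMAT_YV12:
            return "public YUV format";
        default:
            // 0x100 - 0x1FF is reserved for the vendor's HAL [11];
            // this is what the decoder actually hands us.
            if (format >= 0x100 && format <= 0x1FF)
                return "vendor-specific format";
            return "unknown format";
        }
    }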
For the love of God, why would anybody use an unknown (undocumented)
pixel format? So I started to dig deeper and deeper, and after a long
and painful period of time I think I found the answer:
these images use "special" [11] pixel formats which are hardware
vendor specific. Such images can be pushed to the video card, and the
video card converts them very, very fast into a GL texture;
practically, Android uses these textures to draw the frames onto a
Surface, similar to what this [12] test example does.
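
The trick that test example [12] relies on, roughly sketched: wrap
the vendor buffer in an EGLImage and sample it through the
GL_OES_EGL_image_external extension, so the driver does the
conversion while drawing (the prototypes below need
EGL_EGLEXT_PROTOTYPES / GL_GLEXT_PROTOTYPES, as on Android):

    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    // Fragment shader: the external sampler understands the vendor
    // pixel format, and the YUV -> RGB conversion happens on the GPU.
    static const char *kFragmentShader =
        "#extension GL_OES_EGL_image_external : require\n"
        "precision mediump float;\n"
        "uniform samplerExternalOES sTexture;\n"
        "varying vec2 vTexCoord;\n"
        "void main() {\n"
        "    gl_FragColor = texture2D(sTexture, vTexCoord);\n"
        "}\n";

    // Bind a gralloc buffer (kept in its vendor format) to a texture.
    void bindVendorBuffer(EGLDisplay dpy, EGLClientBuffer buffer, GLuint tex)
    {
        EGLImageKHR image = eglCreateImageKHR(dpy, EGL_NO_CONTEXT,
                                              EGL_NATIVE_BUFFER_ANDROID,
                                              buffer, 0);
        glBindTexture(GL_TEXTURE_EXTERNAL_OES, tex);
        glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES,
                                     (GLeglImageOES)image);
    }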
Now the problem is how to use this texture in Qt. In order to display
it in a Qt 4.x application we somehow need to get at the pixels. We
can use the texture to fill an FBO and read the pixels with
glReadPixels, but glReadPixels is way too slow; or we can use it to
fill a PBO, which is a little bit faster but still not enough,
actually not even close to what we need for realtime video playback!
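
A sketch of that readback path, to make the bottleneck concrete
(plain GLES 2 calls; the PBO variant needs an extension or GLES 3,
and it only hides part of the copy):

    #include <GLES2/gl2.h>

    // Render the external texture into an FBO-attached RGBA texture,
    // then pull the pixels back into client memory.
    bool readbackFrame(int w, int h, void *pixels /* w * h * 4 bytes */)
    {
        GLuint fbo, tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, 0);

        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, tex, 0);
        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
            return false;

        // ... draw the frame's external texture into the FBO here ...

        // This stalls until the GPU is done, then copies across the
        // bus: far too slow for 720p at 25-30 frames per second.
        glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        glDeleteFramebuffers(1, &fbo);
        glDeleteTextures(1, &tex);
        return true;
    }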

So what's next? Sadly, I have no idea how to create a realtime video
recorder/player on Android using Qt 4.x! So, unless somebody comes up
with another idea, I'm going to remove the multimedia plugin from the
alpha4 release.
It seems Qt 5.x will be able to use the texture directly [13], but
I'm not very sure!
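
If that works out, it might look something like the sketch below;
every name here is an assumption based on the Qt 5 scene-graph design
(see [13]), not an API I have verified:

    #include <QQuickItem>
    #include <QQuickWindow>
    #include <QSGSimpleTextureNode>
    #include <QSize>

    // Hypothetical item that adopts the GL texture the decoder
    // rendered into; the scene graph samples it directly, with no
    // glReadPixels round trip.
    class VideoItem : public QQuickItem
    {
    public:
        VideoItem() : m_texId(0) { setFlag(ItemHasContents); }

    protected:
        QSGNode *updatePaintNode(QSGNode *old, UpdatePaintNodeData *)
        {
            QSGSimpleTextureNode *node = static_cast<QSGSimpleTextureNode *>(old);
            if (!node) {
                node = new QSGSimpleTextureNode;
                // m_texId would come from the decoder's GL side.
                node->setTexture(window()->createTextureFromId(m_texId, m_size));
            }
            node->setRect(boundingRect());
            return node;
        }

    private:
        uint m_texId;
        QSize m_size;
    };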
Alpha4 is the last release that comes with new features for Qt 4.x;
the releases after it which target Qt 4.x will contain only bugfixes.
After we ship alpha4 I'll focus on getting Qt 5 ported.


Cheers,
BogDan.


[1] https://play.google.com/store/apps/details?id=com.clov4r.android.nil
[2] https://gitorious.org/android-ffmpeg/android-ffmpeg
[3] http://code.google.com/p/dolphin-player/
[4] https://play.google.com/store/apps/details?id=com.broov.player
[5] https://gitorious.org/android-ffmpeg/android-ffmpeg/blobs/master/libavcodec/libstagefright.cpp
[6] http://androidxref.com/source/xref/system/media/wilhelm/src/android/MediaPlayer_to_android.cpp#android_Player_setNativeWindow
[7] http://androidxref.com/source/xref/frameworks/base/include/gui/SurfaceTextureClient.h
[8] http://androidxref.com/source/xref/frameworks/base/include/gui/ISurfaceTexture.h
[9] http://androidxref.com/source/xref/frameworks/base/include/ui/PixelFormat.h#63
[10] http://androidxref.com/source/xref/system/core/include/system/graphics.h#43
[11] http://androidxref.com/source/xref/system/core/include/system/graphics.h#54
[12] http://androidxref.com/source/xref/frameworks/base/opengl/tests/gl2_yuvtex/gl2_yuvtex.cpp
[13] http://qt-project.org/wiki/Qt-5-Alpha

