If you have followed my past posts, you have often heard terms such as “addressing fragmentation” and “solving the Android video problem”. But what do these really mean, and what is the problem?
Doesn’t Android support HLS on newer devices, solving all your video playback and fragmentation issues? Not really.
The main challenge is that HLS was designed by Apple and optimized for iOS devices. Even though the live latency is significant, everything else, from the switching logic to the playback experience, is excellent on iOS. On other platforms, however, implementations are based on the HLS IETF draft, which has remained in informational state for many years. As with any spec, it leaves a lot of room for interpretation, which unavoidably leads to HLS implementations of varying quality.
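To make the room for interpretation concrete, here is a minimal sketch (my own illustration, not from the post) of parsing the variant entries of an HLS master playlist. The `#EXT-X-STREAM-INF` tag and its `BANDWIDTH` attribute are defined in the draft, but how a player picks and switches between the variants it finds is left to each implementer — which is exactly where implementations diverge. Note the attribute split below is naive and would break on quoted attributes containing commas (e.g. `CODECS`).

```python
# Minimal sketch of parsing an HLS master playlist (illustrative only).
# The tags are from the HLS draft; the selection policy is up to the player.

def parse_master_playlist(text):
    """Return a list of (bandwidth, uri) variant tuples."""
    variants = []
    pending_bandwidth = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-STREAM-INF:"):
            # Attribute list, e.g. BANDWIDTH=1280000,RESOLUTION=640x360.
            # Naive comma split; real parsers must handle quoted values.
            for attr in line.split(":", 1)[1].split(","):
                if attr.startswith("BANDWIDTH="):
                    pending_bandwidth = int(attr.split("=", 1)[1])
        elif line and not line.startswith("#"):
            # A non-tag line after EXT-X-STREAM-INF is the variant URI.
            variants.append((pending_bandwidth, line))
            pending_bandwidth = None
    return variants

playlist = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1280000,RESOLUTION=640x360
mid.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2560000,RESOLUTION=1280x720
high.m3u8
"""
print(parse_master_playlist(playlist))
# → [(1280000, 'mid.m3u8'), (2560000, 'high.m3u8')]
```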
LongTail did an analysis of the current state of HLS on Android, which lists some of its shortcomings:
Android 2.3 (Gingerbread)
- No support, despite this being the most popular version of Android
Android 3.0 (Honeycomb)
- Streams cause tablet devices to crash
Android 4.0 (Ice Cream Sandwich)
- VOD streams do not seek
- Aspect ratios are not detected and cause image deformation
- Fullscreen causes videos to restart from the beginning
Android 4.1+ (Jelly Bean)
- Aspect ratio issue is fixed, but seek is still unavailable
- Chrome does not understand HLS, leading to broken MIME type detection
- Taking video fullscreen causes devices to throw an error and stop playback
The original solution for video on Android used to be Flash Player, which gave video applications an abstraction layer to reach all Android devices, regardless of the underlying video capabilities.
Even though it is not officially supported anymore, there is a recent tutorial on how to install the archived version of Flash Player on Android – unfortunately it is not compatible with Chrome, only with the old Android browser.
This leads to a question: with Flash unsupported and native HLS playback not very robust, how is it possible that there are so many video applications on Android?
This is where the openness of the Android platform comes in. Android might not offer a high-quality out-of-the-box HLS video stack, but it provides APIs that grant low-level access to develop solutions such as the Adobe Primetime Player, which includes, among many other features, its own HLS video stack. This is very different from iOS, where using the native HLS video stack is required to get 3G/4G application approval.
Conclusion
Is HLS on Android really bad? Yes, if you rely only on native capabilities – but that does not mean you cannot deploy high-quality HLS playback (or any other protocol), since Android provides flexible interfaces to extend the platform and deliver a high-quality experience. Whether this is a better or worse approach than iOS’s is a philosophical question: Android’s openness leaves more room for issues, but in return offers more freedom for extensibility.
ken_asterisk Flash Player is planning to support it (actually, it already does), so I intend to watch and see how that affects the situation (or whether it doesn’t).
Oh! Then I will keep watching this going forward. RT otachan: ken_asterisk Flash Player is planning to support it (actually, it already does), so I intend to watch and see how that affects the situation (or whether it doesn’t).
“this is very different from iOS, where HLS (or progressive download) is the only path for HW decoded video.” Nope – you can directly access the hardware decoder via public APIs: http://stackoverflow.com/questions/10646657/hardware-accelerated-h-264-decoding-to-texture-overlay-or-similar-in-ios and this functionality is used by just about every app in the iOS App Store that supports playback of mkv files.
Thanks for sharing, I’ll forward this information.
Thanks for the correction, the APIs are available. What is challenging are Apple’s 3G/4G video approval rules, which require HLS content; otherwise an app will only get approval for Wi-Fi. I have corrected this in the article.
Here is actually a bit more info: this method only works for media types supported by the framework, i.e., it does not work for arbitrary containers, and there is no way to swap out the transport mechanism. So you could not really build a streaming protocol with it. The Apple docs for AVAsset state that.
As of the release of Android 4.4, Google states: “Android 4.4 updates the platform’s HTTP Live Streaming (HLS) support to a superset of version 7 of the HLS specification (version 4 of the protocol).” Can this article be updated to include any caveats with HLS live/on-demand streaming on Android?
I can check on Android 4.4 specifically (and welcome external test results). HLS was supported in the past according to the API documentation, but there were issues preventing practical use.
What’s the state of HLS on Android after the 5.1 or 6.0 releases? Is it the same, or has the situation improved? Can anyone comment?