Making of the Microgramcaster
July 2, 2014 erik.lupander@squeed.com

Back in April the first version of the free Chromecast / Android application ’Microgramcaster’ was released as a Squeed Labs project on Google Play, with source code available on GitHub. Since then, the application has received a few important feature updates and is currently in the feature-wise state I envisioned when I started out on the ’Microgramcaster’ journey back in February. The time has come to blog about it!

The background and the journey

(If you’re more into the technical aspects, feel free to skip ahead to the next section.)

So, one might ask oneself – where did the idea to do a Chromecast local/LAN media streaming app originate? I was asked to submit an abstract to the call for papers for the mobile development conference dev:mobile, based on the series of Chromecast development blog posts I did in the autumn of 2013. Those blog posts were made in the context of a project I named ’Hipstacaster’, a little app to cast Flickr albums to the Chromecast. While I learned a great deal writing those blog posts, the Chromecast SDK was fundamentally changed when the production release of the SDK happened in early February 2014. As my talk submission was accepted, I chose to create something new for the conference talk rather than rewriting Hipstacaster.

One of the original flaws of the Chromecast was its out-of-the-box inability to cast content from local storage and/or local area network sources such as NAS disks and media servers. The ever-creative developer community came up with ways to handle this using 3rd-party applications, but at the start of February the offerings were quite scarce. I actually paid a few dollars for an application claiming to support various casting use cases, but it turned out to not work that well – supposedly playable mp4 videos were reported as unplayable, no DLNA/UPnP/SMB support etc. – so I decided to try to write an application as a showcase for the conference that would at the very least support reliable casting of local content, with UPnP/SMB support as stretch goals.

I did the bulk of the development as early as late February, but being a 10PM-after-the-kids-have-fallen-asleep kind of project, testing, bug-fixing, setting up the receiver environment etc. took its time. When I finally pressed the ’Publish’ button in the Google Play Developer Console, it was mid-April and several great apps doing the same and more were already available in the Play Store. After fixing a few early bugs with a maintenance release, I concentrated on the dev:mobile presentation on the 22nd of May before finally pursuing my stretch goals. UPnP Media Server support was added on the 11th of June, and SMB support along with a lot of other improvements was added on the 27th of June. The reception has certainly been less than stellar – about 500 people have installed the app in total to date. On the other hand, the full marketing effort has been three tweets and two forum posts in a domestic Swedish Android community, so I guess most of the installers have simply found the app while browsing the Play Store. With the latest release out, I’m hoping the number of installs can increase somewhat, especially if I spend a little more time sharing the app on social networks and humbly asking friends and colleagues to share and rate it.

Technical Overview

Receiver Application
So, the receiver application is what’s running in the embedded Chrome browser window on the Chromecast stick – essentially a web page which in my case consists of an HTML5 <video> element, some divs for showing overlay info, a JavaScript library to incorporate Google Cast support and a few scripts of my own to implement the messaging protocol between the sender and receiver plus control the video playback.

It’s not very complicated – the receiver scripts let me handle incoming messages in an ’onMessage(event)’ manner, where I use a few switch…case statements to handle the different messages in my protocol such as ”PLAY_URL”, ”PLAY”, ”PAUSE” and ”ROTATE”. These messages are forwarded to my own little video player interfacing script which does programmatic things (such as play()) with the standard HTML5 video element.

I also rely heavily on the built-in lifecycle callbacks of the video element such as ’playing’, ’paused’, ’loading’ etc. to update the receiver GUI (for example show/hide the cover art overlay) and notify the sender application of events such as ’playback ended’ or ’playing’. One of the trickiest parts was keeping the SeekBar in the Android application in sync with the actual progress of the video clip on the Chromecast – one of the tricks is to always send the actual progress back from the receiver to the sender with the play/pause etc. messages so the seekbar can be updated accordingly.

From a DevOps point of view, the receiver application is built and deployed using a Thoughtworks Go pipeline. It’s actually a Java web application (.war archive) with just HTML/JS/CSS content, and it’s deployed on a Tomcat web server (using the Tomcat maven plugin) with an Apache in front with some port-forwarding in place to beef up security. It’s got its own subdomain under squeed.io, due to the fact that Google requires receiver applications to be loaded over HTTPS – luckily squeed.io has a domain-wide certificate in place.

Android Sender Application


The first release of the Microgramcaster app only supported local media in a flat content listing, with play/pause in the actionbar.

As the Android sender application was developed in three distinct stages – local content, UPnP content and finally SMB content – I’ll divide this section into three.

Local Content

The key thing to understand is that the Chromecast shows videos through the HTML5 video element – which, as far as I know, can only read videos over HTTP(S). That means serving local content from the device must involve an HTTP server somewhere – either the device must provide one on its own, or local content will need to be proxied by a server on the internet. I chose the first option, as uploading local content to a server on the internet and then fetching it back to the Chromecast on the same network seemed… unnecessary.

The first step was determining whether the Chromecast could indeed play back content from a LAN IP address. There was a controversy back in August of 2013 when Google stopped local content from being played back through a firmware update of the cast stick. I don’t know exactly how local content playback was blocked – whether the Chromecast was prevented from loading things from 192.168.. addresses or something else. Anyway – before creating the Android app, I registered a receiver URL in the Google Cast Console and put an extremely simple index.html file in a .war archive deployed on a JBoss 7 running on my development machine – i.e. I registered something like http://192.168.1.130:8080/microgramcaster as the receiver URL, which makes my development-registered Chromecast load the receiver application from my own development computer. In this simple web page I had a video element, and by using the nice support for Chromecast debugging through the standard Chrome Developer Tools, I could just add a src=”http://192.168…/sintel720p.mp4″ attribute to the video element. With this mp4-encoded video clip available on a server on my local network, it worked like a charm, and I had verified that the Chromecast, as of February 2014, could play back video from a LAN address and that my strategy for local playback was valid.


Using the debug tools to introspect the HTTP requests from the video element.

The local HTTP server
So – how do you incorporate a web server into an Android application? It’s actually surprisingly easy to do. After all, a web server at its core just keeps a network socket open on a given port, expects requests formatted in a given way, and responds in a given way with headers, HTTP status codes etc. However, while reinventing the wheel may be a lot of fun, it’s not necessarily very time-effective, so I took the short route of using the existing org.apache.http components. That turned out to be a bit of a dead end. While early tests showed that some video clips were served correctly, most were not, and some careful inspection of the outgoing HTTP requests from the receiver application showed that the responding web server needed to support partial content responses, i.e. HTTP 206, which requires some specific response headers etc.

Being quite time-constrained, I took a peek at my friend the Internet and quickly found the NanoHTTPD project, which turned out to be an (almost) perfect match as it has 206 support out of the box, among other things. Since streaming requires the web server to be available even if the Android app is put into the background, the NanoHTTPD web server is run as an Android Service.
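To give a feel for how this can be wired up, here is a minimal sketch of a NanoHTTPD instance wrapped in an Android Service. The class names (WebServerService, MediaWebServer), the port 8282, the naive URI-to-file mapping and the NanoHTTPD 2.x API (serve(IHTTPSession), newFixedLengthResponse) are illustrative assumptions and not necessarily what Microgramcaster itself does – the NanoHTTPD API has changed between versions.

```java
import android.app.Service;
import android.content.Intent;
import android.os.IBinder;
import fi.iki.elonen.NanoHTTPD;

import java.io.FileInputStream;
import java.io.IOException;

public class WebServerService extends Service {

    // Hypothetical NanoHTTPD subclass serving files from local storage.
    static class MediaWebServer extends NanoHTTPD {
        MediaWebServer(int port) {
            super(port);
        }

        @Override
        public Response serve(IHTTPSession session) {
            try {
                // Very naive mapping of request URI -> file on external storage.
                String path = "/sdcard" + session.getUri();
                FileInputStream fis = new FileInputStream(path);
                // The post notes NanoHTTPD gave HTTP 206 partial-content support
                // out of the box; range-handling details are omitted in this sketch.
                return newFixedLengthResponse(Response.Status.OK, "video/mp4",
                        fis, fis.getChannel().size());
            } catch (IOException e) {
                return newFixedLengthResponse(Response.Status.NOT_FOUND,
                        MIME_PLAINTEXT, "Not found");
            }
        }
    }

    private MediaWebServer server;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        if (server == null) {
            try {
                server = new MediaWebServer(8282);
                server.start();
            } catch (IOException e) {
                stopSelf();
            }
        }
        // Keep the server alive even if the app goes to the background.
        return START_STICKY;
    }

    @Override
    public void onDestroy() {
        if (server != null) {
            server.stop();
        }
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // Started service only, no binding needed.
    }
}
```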


Local media items in the ’flat’ browsing structure.

Finding and displaying local media
What good is an HTTP server if you don’t know what to play back or where to find your playable clips? From the outset, the overriding design goal was simplicity, which in my humble opinion ruled out clip discovery from the raw file system of the device. Instead, I chose to use the Android MediaStore API, which is a ContentProvider API for accessing indexed video clips from local storage through SQLite queries. Putting the query results into a List of my own video clip representation, the MediaItem class, it was easy enough to build a basic application layout with a ListView of mp4 video clips and an ActionBar providing basic interaction capabilities such as pressing the cast icon, refreshing the ListView, play/pause and rotating the video element.
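As an illustration of that approach, here is a hedged sketch of a MediaStore query for local mp4 clips. The MediaItem fields and the exact projection/selection are assumptions for the example, not the app’s actual code.

```java
import android.content.ContentResolver;
import android.database.Cursor;
import android.provider.MediaStore;

import java.util.ArrayList;
import java.util.List;

public class LocalMediaScanner {

    // Hypothetical value object mirroring the MediaItem mentioned in the text.
    public static class MediaItem {
        public final long id;
        public final String title;
        public final long durationMs;
        public final String filePath;

        public MediaItem(long id, String title, long durationMs, String filePath) {
            this.id = id;
            this.title = title;
            this.durationMs = durationMs;
            this.filePath = filePath;
        }
    }

    public List<MediaItem> findLocalVideos(ContentResolver resolver) {
        List<MediaItem> items = new ArrayList<MediaItem>();
        String[] projection = {
                MediaStore.Video.Media._ID,
                MediaStore.Video.Media.TITLE,
                MediaStore.Video.Media.DURATION,
                MediaStore.Video.Media.DATA       // absolute path on the device
        };
        // Only interested in mp4 clips, since that's what the Chromecast plays.
        Cursor cursor = resolver.query(
                MediaStore.Video.Media.EXTERNAL_CONTENT_URI,
                projection,
                MediaStore.Video.Media.MIME_TYPE + " = ?",
                new String[]{"video/mp4"},
                MediaStore.Video.Media.TITLE + " ASC");
        if (cursor != null) {
            try {
                while (cursor.moveToNext()) {
                    items.add(new MediaItem(
                            cursor.getLong(0),
                            cursor.getString(1),
                            cursor.getLong(2),
                            cursor.getString(3)));
                }
            } finally {
                cursor.close();
            }
        }
        return items;
    }
}
```

The resulting list can then back an ArrayAdapter for the ListView directly.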

Adding Casting and Receiver interaction
To be honest, the Google Cast API and MediaRouter code is 98% boilerplate which I just borrowed from a Google Cast API example. The key thing is at the end of the initialization callback cycle, where you register the subclass that implements your very own messaging protocol. It’s all very simple from the developer’s point of view – on both the sender and receiver sides you have access to a sendMessage method and an onMessage callback. Send JSON, parse received JSON, act accordingly.
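A minimal sketch of what such a sender-side custom channel can look like with the 2014-era Cast SDK (GoogleApiClient + Cast.CastApi). The namespace string, the JSON field names and the class name are made-up examples, not Microgramcaster’s actual protocol.

```java
import com.google.android.gms.cast.Cast;
import com.google.android.gms.cast.CastDevice;
import com.google.android.gms.common.api.GoogleApiClient;

import org.json.JSONException;
import org.json.JSONObject;

import java.io.IOException;

public class MicrogramcasterChannel implements Cast.MessageReceivedCallback {

    // Custom namespaces must start with "urn:x-cast:"; this exact value is an assumption.
    public static final String NAMESPACE = "urn:x-cast:com.example.microgramcaster";

    // Register this channel once the Cast connection is up.
    public void register(GoogleApiClient apiClient) {
        try {
            Cast.CastApi.setMessageReceivedCallbacks(apiClient, NAMESPACE, this);
        } catch (IOException e) {
            // Channel could not be set up; log and handle as appropriate.
        }
    }

    @Override
    public void onMessageReceived(CastDevice castDevice, String namespace, String message) {
        // Parse incoming JSON (e.g. an EVENT_PLAYING with progress/duration)
        // and update the sender UI accordingly.
    }

    // Send a PLAY_URL command pointing at the embedded web server.
    public void sendPlayUrl(GoogleApiClient apiClient, String url) {
        try {
            JSONObject json = new JSONObject();
            json.put("command", "PLAY_URL");
            json.put("url", url);
            Cast.CastApi.sendMessage(apiClient, NAMESPACE, json.toString());
        } catch (JSONException e) {
            // Malformed message; ignore in this sketch.
        }
    }
}
```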

Wrapping the first iteration up

Once the basics were in place – i.e. casting worked, local media discovery worked etc. – it was time to add the finishing touches such as the play/pause controls and the seekbar. At this stage I had chosen to keep a single context-aware play/pause button in the actionbar, while the SeekBar used to ”seek” in ongoing videos was placed at the bottom.

As an end-to-end use case, the seek functionality serves as a pretty good example. Basically, when a clip is about to be played, we get the total duration from the MediaItem instance and set the number of seconds with seekBar.setMax(…). We also reset the current progress to 0. The updating of the seekbar cannot start just yet, however, as there are usually several seconds between when the user ”clicks” on the clip he/she wants to play and when the actual playback starts. Starting the seekbar update thread right away would put it several seconds ahead of the actual progress!

The solution is to utilize the messaging protocol and the lifecycle callbacks of the HTML5 video element. When you start to play something from the sender GUI, the HTTP URL of the video file and some metadata is sent as a JSON message to the receiver. The receiver JS code adds a new source element as a child of the video element with that URL, which triggers automatic playback of the video. When initializing the receiver application, a callback for the ’playing’ lifecycle event of the video element was registered – this callback fires once the actual playback has started, after loading, buffering etc. (a bit simplified, but to keep this example clear let’s stick with this model). This callback does some things with the receiver GUI – but we also get the _actual_ progress from the HTML5 video element along with the total length (which comes in handy later when we are playing SMB-sourced videos). The progress and total duration are then sent back to the sender as an ’EVENT_PLAYING’ message of the messaging protocol. The sender picks up the progress (and total duration) and updates the SeekBar accordingly. The same mechanism is used when we pause the video: we then utilize the ’paused’ callback and send back the correct progress so we can make sure the seekbar hasn’t come out of sync. Getting out of sync is easier than one might think – just an unexpected second of buffering after playback has started can be enough to throw it off.
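To make the sender half of that flow concrete, here is a small hedged sketch of re-anchoring the SeekBar when the receiver reports an event. The JSON field names ("progress", "duration") and the class name are assumptions for illustration.

```java
import android.app.Activity;
import android.widget.SeekBar;

import org.json.JSONException;
import org.json.JSONObject;

public class SeekBarSync {

    private final Activity activity;
    private final SeekBar seekBar;

    public SeekBarSync(Activity activity, SeekBar seekBar) {
        this.activity = activity;
        this.seekBar = seekBar;
    }

    // Called from the custom channel's onMessageReceived for EVENT_PLAYING / paused events.
    public void onReceiverEvent(String jsonMessage) {
        try {
            JSONObject json = new JSONObject(jsonMessage);
            final int durationSeconds = json.getInt("duration");
            final int progressSeconds = json.getInt("progress");
            // The SeekBar must be touched on the UI thread.
            activity.runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    seekBar.setMax(durationSeconds);
                    // Re-anchor to the receiver's actual progress so buffering
                    // delays don't let the bar drift ahead of playback.
                    seekBar.setProgress(progressSeconds);
                }
            });
        } catch (JSONException e) {
            // Malformed message; ignore in this sketch.
        }
    }
}
```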

Initial bugs and the hazards of testing your own code with your own stuff

After the initial release, I got an error report stating that the app would just force close or appear unresponsive on first startup. It turned out that doing MediaStore requests on the UI thread was slightly hazardous – the SQLite queries themselves were not really a problem, but if no app had ever requested the chosen thumbnail format before, people with hundreds of personal video files on their devices would find the app frozen while the underlying Android mechanics generated thumbnail images for all their video files. This was technically not hard to fix using asynchronous loading, but it is a good example of unexpected problems popping up due to insufficient testing, or just not realizing that your own test device didn’t reflect real-world usage. I will get back to a similar example later on in the UPnP section.
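A minimal sketch of the kind of asynchronous thumbnail loading that solves this, using a plain AsyncTask; the class name and how the bitmap is handed to the list row are assumptions.

```java
import android.content.ContentResolver;
import android.graphics.Bitmap;
import android.os.AsyncTask;
import android.provider.MediaStore;
import android.widget.ImageView;

public class ThumbnailTask extends AsyncTask<Long, Void, Bitmap> {

    private final ContentResolver resolver;
    private final ImageView target;

    public ThumbnailTask(ContentResolver resolver, ImageView target) {
        this.resolver = resolver;
        this.target = target;
    }

    @Override
    protected Bitmap doInBackground(Long... videoIds) {
        // May be slow the first time, as Android generates the thumbnail on demand -
        // exactly the work that froze the app when done on the UI thread.
        return MediaStore.Video.Thumbnails.getThumbnail(
                resolver, videoIds[0], MediaStore.Video.Thumbnails.MINI_KIND, null);
    }

    @Override
    protected void onPostExecute(Bitmap bitmap) {
        if (bitmap != null) {
            target.setImageBitmap(bitmap);
        }
    }
}
```

Usage would be along the lines of `new ThumbnailTask(getContentResolver(), imageView).execute(mediaItem.id);` from the list adapter.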

Adding UPnP Media Server support

The second stretch goal was adding support for playing content from UPnP Media Servers such as XBMC, Plex, BubbleUPnP and Windows Media Server. All these servers share the capability to offer content remotely over the UPnP protocol, more precisely the ’ContentDirectory’ service type. UPnP is IMHO quite complicated, so I chose to utilize the excellent Cling library. Just like the embedded web server, Cling utilizes the Android Service mechanisms to continuously run a ”UPnP service” in the background which your application can use as a control point to discover devices and perform actions on their services.


UPnP device discovery dialog

In my case, I’m using the Cling Browse action to load the content of a given UPnP ContentDirectory container. As these media servers provide metadata for each media item, such as the HTTP URL to load the media from, thumbnail URL, duration, title etc., it was quite simple to get the videos playing on the Chromecast – instead of supplying a URL to my local web server, I could just put the URL from the metadata into my ’PLAY’ command parameters and the Chromecast could play it right away, given that it was an mp4 file.
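A hedged sketch of what such a Browse call can look like with Cling’s content directory support classes; how the ContentDirectory service is looked up and how the results feed the ListView adapter are simplified assumptions.

```java
import org.fourthline.cling.android.AndroidUpnpService;
import org.fourthline.cling.model.action.ActionInvocation;
import org.fourthline.cling.model.message.UpnpResponse;
import org.fourthline.cling.model.meta.Service;
import org.fourthline.cling.support.contentdirectory.callback.Browse;
import org.fourthline.cling.support.model.BrowseFlag;
import org.fourthline.cling.support.model.DIDLContent;
import org.fourthline.cling.support.model.item.Item;

public class UpnpBrowser {

    // Browse a single container (folder) on the given ContentDirectory service.
    public void browseContainer(AndroidUpnpService upnpService,
                                Service contentDirectory,
                                String containerId) {
        upnpService.getControlPoint().execute(
                new Browse(contentDirectory, containerId, BrowseFlag.DIRECT_CHILDREN) {

                    @Override
                    public void received(ActionInvocation actionInvocation, DIDLContent didl) {
                        // Each item carries metadata incl. an HTTP URL the Chromecast
                        // can load directly - no local proxying needed for UPnP sources.
                        for (Item item : didl.getItems()) {
                            String title = item.getTitle();
                            String url = item.getFirstResource().getValue();
                            // Hypothetical: hand the items off to the ListView adapter here.
                        }
                    }

                    @Override
                    public void updateStatus(Status status) {
                        // Not used in this sketch.
                    }

                    @Override
                    public void failure(ActionInvocation invocation, UpnpResponse operation,
                                        String defaultMsg) {
                        // Show an error to the user.
                    }
                });
    }
}
```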

The UPnP ContentDirectory mechanism lets you return the content of one folder (or ’container’, to be more precise) at a time. While the ContentDirectory specification specifies a ’search’ function, some quick googling revealed that media server support for this function was sketchy at best, so content would have to be fetched one folder at a time. One of the key choices I had to make was whether to try to keep the ”flat” presentation of videos from the ”local” mode or switch to a folder-based structure. Initially, I wanted to keep my simplicity design goal by recursively scanning all UPnP folders to assemble a flat set of mp4 videos. While this worked for the never-used Windows Media Server built into Windows 7, having just a few pre-installed videos, it turned out not to be feasible when I started a Plex server with a decent amount of indexed media on it. (Toddlers can really ruin a DVD in no time; hard drives with backups on them are more child-proof…) Apart from taking an excessive amount of time, it seems at least Plex had some cyclic structures in its container tree which eventually made the entire app crash with OutOfMemory exceptions. I had to do a bit of a rewrite and introduce folder-based browse functionality in my ListView. I could later reuse almost all of it for the SMB support, so it was certainly a good choice anyway – even though I still suspect supporting three different browse sources in the same ArrayAdapter subclass isn’t the wisest choice I’ve ever made. This was the second example of unforeseen consequences, where an original decision turned out to be really bad after being put into a real-life context.


Folder based UPnP browsing

For the 1.1 release with this UPnP support, I also improved the GUI to some extent – moving the play/pause button down beside the SeekBar, adding a Preferences page with some general info, redesigned icons, and the ability to specify whether to show non-mp4 items. Oh – most of the icons are the standard actionbar icons for the Holo DarkActionBar theme, downloadable from Google.

Version 1.2: Adding SMB support

Well – in all honesty – this was probably the one feature I had really wanted from the outset. I keep my DVD backups (it’s 100% legal in Sweden; we even pay a special ’kopieringsavgift’, a private copying levy, on mass storage devices for it) on a hard drive attached to my Cisco WiFi router’s USB port. I don’t like having to have a computer running to access my media, or for that matter copying stuff to the local storage of my phone or tablet just for casting purposes. Being able to get videos onto my living-room TV directly from the SMB share provided by the WiFi router would really be a killer feature for me.

SMB support wasn’t quite as straightforward to add as the UPnP support. First of all, I needed to find a good library for working with SMB – JCIFS to the rescue! I adapted a JCIFS Android example to make my own init and browsing class. I was a bit afraid of having to implement some NetBIOS discovery or similar to find SMB shares on the network, but in this case it was way easier than expected – after some simple initialization it wasn’t harder than passing ”smb://” into the constructor of the JCIFS file representation to get a listing of the SMB shares on my network. Content browsing-wise, it was easy enough to implement SMB browsing alongside the UPnP one with a lot of nice code reuse.

Passing smb:// into the SmbFile constructor finds my workgroup! Well – technically it isn’t a ’folder’, but the browsing functionality doesn’t care whether it’s a workgroup, a share or just a shared folder.
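A minimal sketch of that JCIFS listing call; NTLM configuration and credentials handling are omitted here and anonymous browsing is assumed.

```java
import jcifs.smb.SmbException;
import jcifs.smb.SmbFile;

import java.net.MalformedURLException;

public class SmbBrowser {

    // Passing "smb://" lists workgroups; "smb://host/share/folder/" lists that folder's contents.
    public SmbFile[] list(String smbUrl) throws MalformedURLException, SmbException {
        SmbFile root = new SmbFile(smbUrl);
        return root.listFiles();
    }
}
```

Drilling down is then just a matter of calling list() again with the URL of the entry the user tapped.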

The not-so-straightforward part was getting the video streams from the SMB share to the Chromecast. Why? The SMB share can’t be accessed directly by the receiver’s HTML5 video element, which to my knowledge only loads content over HTTP. My SMB share(s) have no way of serving files over HTTP, so we need to put a proxy in between to serve the SMB files. The natural choice is of course our very own embedded web server already in place. We construct a special URL for SMB sources that the web server knows how to handle. Where we would send http://192.168.1.100:8282/myfile.mp4 as the playback URL in the PLAY_URL command for local content, we instead construct http://192.168.1.100:8282/smb/MYSHARE/SHAREFOLDER/myfile.mp4 as the ’url’ parameter of the PLAY command sent to the Chromecast to play something from an SMB share. In the web server, we then use a different code path to handle the /smb/* path: construct an SmbFile instance from the /MYSHARE…. part, open it, calculate some HTTP response headers, attach the InputStream from the SmbFile to the HTTP output stream – and voilà – we have our proxy! From the Chromecast’s point of view, it’s the device serving the stream, even though it’s just being proxied from the SMB share. The downside is that the phone will continuously serve data, which drains the battery faster, although not at a very alarming rate.
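A hedged sketch of that /smb/* code path: translate the request URI back into an smb:// URL and stream the SmbFile through the HTTP response. The URI-to-share mapping, the class name and the NanoHTTPD 2.x response construction are assumptions, and real code would also honor Range requests.

```java
import fi.iki.elonen.NanoHTTPD;
import jcifs.smb.SmbFile;
import jcifs.smb.SmbFileInputStream;

public class SmbProxy {

    // Map e.g. "/smb/MYSHARE/SHAREFOLDER/myfile.mp4" to "smb://MYSHARE/SHAREFOLDER/myfile.mp4"
    // and serve it as an HTTP response. The real host/share mapping is more involved;
    // this is a simplified illustration.
    public NanoHTTPD.Response serveSmb(String uri) {
        try {
            String smbUrl = "smb:/" + uri.substring("/smb".length());
            SmbFile file = new SmbFile(smbUrl);
            SmbFileInputStream in = new SmbFileInputStream(file);
            // Use SmbFile#length() for Content-Length - available() on the
            // non-buffered SmbFileInputStream does not report the full size.
            return NanoHTTPD.newFixedLengthResponse(
                    NanoHTTPD.Response.Status.OK, "video/mp4", in, file.length());
        } catch (Exception e) {
            return NanoHTTPD.newFixedLengthResponse(
                    NanoHTTPD.Response.Status.NOT_FOUND, NanoHTTPD.MIME_PLAINTEXT, "Not found");
        }
    }
}
```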

I had to fix some things in NanoHTTPD by subclassing its Response class – the SmbFileInputStream is non-buffered, so using available() won’t return the total length, which would result in invalid HTTP Content-Length and Content-Range headers being returned. Instead, one can get the file size from the SmbFile#length() method. I also had to fix problems with files larger than 2 GB (or rather Integer.MAX_VALUE bytes) resulting in Content-Length headers being written with totally invalid values due to the long values being truncated to int.
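To illustrate the arithmetic rather than the actual subclass, a tiny sketch of building these headers with long values; if the sizes are cast to int anywhere along the way, files larger than Integer.MAX_VALUE bytes end up with garbage headers.

```java
public class RangeHeaders {

    // For a request "Range: bytes=<start>-" against a file of totalLength bytes.
    public static String contentRange(long start, long totalLength) {
        long end = totalLength - 1;
        return "bytes " + start + "-" + end + "/" + totalLength;
    }

    public static long contentLength(long start, long totalLength) {
        return totalLength - start; // keep this a long; an int overflows past 2 GB
    }
}
```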

Since SMB shares are just remote file system access, there’s no ”media” metadata readily available for thumbnails, titles, author info, total length etc., which makes it much harder to do cover art, thumbnails and so on. It is possible, using libraries such as isoparser, to get this information from the mp4 (for example) container, but that would require reading actual file content for all remote files and then probably caching it as application data in the Android app. Something to take a look at in the future, perhaps?

Finalizing v1.2

Side drawer

With three different media sources – local, UPnP and SMB – I didn’t like the dialog used in version 1.1, where ”Local” and ”SMB” were hard-coded as items 1 & 2 together with all discovered UPnP devices. This didn’t work well at all, especially since the UPnP scan adds results as they arrive, which would make items ”jump” around, easily leading to misclicks. To remedy this, I introduced Android’s recommended (at least before Android L) side navigation drawer, where the three source types can coexist: the UPnP choice brings up the device dialog while the other two open the ListView browser directly. Not at all the consistency I had desired from the start, but it feels pretty user friendly anyway and probably works as one would expect in each given context.
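For completeness, a minimal sketch of how such a drawer selection might be wired up. The item order and the handler names (browseLocal(), showUpnpDeviceDialog(), browseSmbRoot()) are purely hypothetical, and the 2014-era support-library DrawerLayout is assumed.

```java
import android.support.v4.widget.DrawerLayout;
import android.view.View;
import android.widget.AdapterView;
import android.widget.ListView;

public class DrawerSetup {

    public void wireDrawer(final DrawerLayout drawerLayout, final ListView drawerList) {
        drawerList.setOnItemClickListener(new AdapterView.OnItemClickListener() {
            @Override
            public void onItemClick(AdapterView<?> parent, View view, int position, long id) {
                switch (position) {
                    case 0: /* Local: browseLocal() (hypothetical) */ break;
                    case 1: /* UPnP: showUpnpDeviceDialog() (hypothetical) */ break;
                    case 2: /* SMB: browseSmbRoot() (hypothetical) */ break;
                }
                drawerLayout.closeDrawer(drawerList);
            }
        });
    }
}
```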

Version 1.2 was also beefed up with some of the typical Android ”chores” such as density- and size-specific layouts and images, experimental support for MKV containers, and cover art displayed on the receiver when applicable.

Final words

This wraps up this little story about the development cycles of the Microgramcaster Android app for Chromecast. The presentation at dev:mobile went reasonably well and hopefully raised some more local interest in Chromecast development. If you’ve read this far, I thank you for your time and hope you’ll take a peek at the application on the Play Store. Feel free to try it out and drop a rating or comment too! Did I mention it’s 100% free and has neither in-app purchases nor ads?! 🙂
