Today Ohloh finished importing the Launchpad source code and produced the first source code analysis report. There seems to be something fishy about the reported line counts (e.g. -3,291 lines of SQL), but the commit counts and contributor list look about right. If you’re interested in what sort of effort goes into producing an application like Launchpad, then it is worth a look.
When looking at various UPnP media servers, one of the features I wanted was the ability to play back my music collection through my PlayStation 3. The complicating factor is that most of my collection is encoded in Vorbis format, which is not yet supported by the PS3 (at this point, it doesn’t seem likely that it ever will).
Both MediaTomb and Rygel could handle this to an extent, transcoding the audio to raw LPCM data to send over the network. This doesn’t require much CPU power on the server side, and only requires 1.4 Mbit/s of bandwidth, which is manageable on most home networks. Unfortunately the only playback controls enabled in this mode are play and stop: if you want to pause, fast forward or rewind then you’re out of luck.
Given that Rygel has a fairly simple code base, I thought I’d have a go at fixing this. The first solution I tried was the one I’ve mentioned a few times before: with uncompressed PCM data, file offsets can easily be converted to sample numbers, so if the source format allows time-based seeking we can satisfy byte range requests.
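The arithmetic behind that idea is simple. A minimal sketch for 16-bit stereo LPCM at 44.1 kHz (the parameter values here are assumptions about the stream format, not taken from Rygel’s code):

```python
SAMPLE_RATE = 44100          # samples per second per channel
CHANNELS = 2                 # stereo
BYTES_PER_SAMPLE = 2         # 16-bit samples
FRAME_SIZE = CHANNELS * BYTES_PER_SAMPLE   # bytes per audio frame
BYTE_RATE = SAMPLE_RATE * FRAME_SIZE       # 176,400 bytes per second

def byte_offset_to_seconds(offset):
    """Convert a byte offset in the PCM stream to a time offset."""
    return offset / BYTE_RATE

def seconds_to_byte_offset(seconds):
    """Convert a time offset to the nearest frame-aligned byte offset."""
    frame = int(seconds * SAMPLE_RATE)
    return frame * FRAME_SIZE
```

At 176,400 bytes per second, this also works out to the 1.4 Mbit/s bandwidth figure mentioned above.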
I got a basic implementation of this working, but it was a little bit jumpy and not as stable as I’d like. Before fully debugging it, I started looking at the mysterious DLNA options I’d copied over to get things working. One of those was the “DLNA operation”, which was set to “range” mode. Looking at the GUPnP header files, I noticed there was another value named “timeseek”. When I picked this option, the HTTP requests from the PS3 changed:
GET /... HTTP/1.1
Host: ...
User-Agent: PLAYSTATION 3
Connection: Keep-Alive
Accept-Encoding: identity
TimeSeekRange.dlna.org: npt=0.00-
transferMode.dlna.org: Streaming
The pause, rewind and fast forward controls were now active, although only the pause control actually worked properly. After fast forwarding or rewinding, the PS3 would issue another HTTP request with the TimeSeekRange.dlna.org header specifying the new offset, but the playback position would reset to the start of the track when the operation completed. After a little more experimentation, I found that the playback position didn’t reset if I included TimeSeekRange.dlna.org in the response headers. Of course, I was still sending back the beginning of the track at this point but the PS3 acted as though it was playing from the new point in the song.
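To act on these requests, the server needs to pull the start time out of the header. A minimal parsing sketch, handling only the `npt=seconds` form the PS3 sends (real NPT values may also use the `h:mm:ss.f` notation, which this ignores):

```python
import re

def parse_time_seek_range(value):
    """Parse a TimeSeekRange.dlna.org value such as 'npt=30.00-' or
    'npt=30.00-120.00' into a (start, end) pair of seconds.  The end
    time is None when the request is open ended."""
    match = re.match(r'npt=([0-9.]+)-([0-9.]+)?$', value)
    if match is None:
        raise ValueError('unsupported TimeSeekRange: %r' % value)
    start = float(match.group(1))
    end = float(match.group(2)) if match.group(2) else None
    return start, end
```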
It wasn’t much more work to update the GStreamer calls to seek to the requested offset before playback and things worked pretty much as well as for non-transcoded files. And since this solution didn’t involve byte offsets, it also worked for Rygel’s other transcoders. It even worked to an extent with video files, but the delay before playback was a bit too high to make it usable — fixing that would probably require caching the GStreamer pipeline between HTTP requests.
Thoughts on DLNA
While it can be fun to reverse engineer things like this, it was a bit annoying to only be able to find out about the feature by reading header files written by people with access to the specification. I can understand having interoperability and certification requirements to use the DLNA logo, but that does not require that the specifications be private.
As well as keeping the specification private, it feels like some aspects have been intentionally obfuscated, using bit fields represented in both binary and hexadecimal string representations inside the resource’s protocol info. This might seem reasonable if it was designed for easy parsing, but you need to go through two levels of XML processing (the SOAP envelope and then the DIDL payload) to get to these flags. Furthermore, the attributes inherited from the UPnP MediaServer specifications are all human readable so it doesn’t seem like an arbitrary choice.
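As an example of those flags, the “DLNA operation” value mentioned earlier ends up on the wire as a two-binary-digit DLNA.ORG_OP field in the protocol info. As commonly documented by open source implementations (the spec itself being private), the first digit advertises TimeSeekRange support and the second advertises byte Range support:

```python
def parse_dlna_op(value):
    """Decode a DLNA.ORG_OP value such as '10' or '01': the first
    digit signals TimeSeekRange.dlna.org support, the second signals
    HTTP byte Range support."""
    if len(value) != 2 or any(c not in '01' for c in value):
        raise ValueError('unsupported DLNA.ORG_OP: %r' % value)
    return {'time_seek': value[0] == '1',
            'byte_range': value[1] == '1'}
```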
On the bright side, I suppose we’re lucky they didn’t use cryptographic signatures to lock things down like Apple has with some of their protocols and file formats.
One of the features of Rygel that I found most interesting was the external media server support. It looked like an easy way to publish information on the network without implementing a full UPnP/DLNA media server (i.e. handling the UPnP multicast traffic, transcoding to a format that the remote system can handle, etc).
As a small test, I put together a server that exposes the ABC’s iView service to UPnP media renderers. The result is a bit rough around the edges, but the basic functionality works. The source can be grabbed using Bazaar:
bzr branch lp:~jamesh/+junk/rygel-iview
It needs Python, Twisted, the Python bindings for D-Bus and rtmpdump to run. The program exports the guide via D-Bus, and uses rtmpdump to stream the shows via HTTP. Rygel then publishes the guide via the UPnP media server protocol and provides MPEG2 versions of the streams if clients need them.
There are still a few rough edges though. The video from iView comes as 640×480 with a 16:9 aspect ratio so has a 4:3 pixel aspect ratio, but there is nothing in the video file to indicate this (I am not sure if flash video supports this metadata).
Getting Twisted and D-Bus to cooperate
Since I’d decided to use Twisted, I needed to get it to cooperate with the D-Bus bindings for Python. The first step here was to get both libraries using the same event loop. This can be achieved by setting Twisted to use the glib2 reactor, and enabling the glib mainloop integration in the D-Bus bindings.
Next was enabling asynchronous D-Bus method implementations. There is support for this in the D-Bus bindings, but it has quite a different (and less convenient) API compared to Twisted’s. A small decorator was enough to overcome this impedance mismatch:
from functools import wraps

import dbus.service
from twisted.internet import defer

def dbus_deferred_method(*args, **kwargs):
    def decorator(function):
        function = dbus.service.method(*args, **kwargs)(function)
        @wraps(function)
        def wrapper(*args, **kwargs):
            dbus_callback = kwargs.pop('_dbus_callback')
            dbus_errback = kwargs.pop('_dbus_errback')
            d = defer.maybeDeferred(function, *args, **kwargs)
            d.addCallbacks(
                dbus_callback,
                lambda failure: dbus_errback(failure.value))
        wrapper._dbus_async_callbacks = ('_dbus_callback', '_dbus_errback')
        return wrapper
    return decorator
This decorator could then be applied to methods in the same way as the @dbus.service.method method, but it would correctly handle the case where the method returns a Deferred. Unfortunately it can’t be used in conjunction with @defer.inlineCallbacks, since the D-Bus bindings don’t handle varargs functions properly. You can of course call another function or method that uses @defer.inlineCallbacks though.
The iView Guide
After coding this, it became pretty obvious why it takes so long to load up the iView flash player: it splits the guide data over almost 300 XML files. This might make sense if it relied on most of these files remaining unchanged and stored in the cache; however, it also uses a cache-busting technique when requesting them (adding a random query component to the URL).
Most of these files are series description files (some for finished series with no published programs). These files contain a title, a short description, the URL for a thumbnail image and the IDs for the programs belonging to the series. To find out about those programs, you need to load all the channel guide XML files until you find which one contains the program. Going in the other direction, if you’ve got a program description from the channel guide and want to know about the series it belongs to (e.g. to get the thumbnail), you need to load each series description XML file until you find the one that contains the program. So there aren’t many opportunities to delay loading of parts of the guide.
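Those two lookups are exactly what a reverse index makes cheap once the whole guide has been fetched anyway. A sketch using hypothetical data shapes (not iView’s actual schema):

```python
def build_program_index(series_descriptions):
    """Build a reverse index from program ID to its series, given
    parsed series descriptions.  Assumed shape (for illustration):
    dicts with 'id', 'title' and 'program_ids' keys."""
    program_to_series = {}
    for series in series_descriptions:
        for program_id in series['program_ids']:
            program_to_series[program_id] = series
    return program_to_series
```

With an index like this, finding the series (and hence the thumbnail) for a program from the channel guide becomes a single dictionary lookup instead of a scan over every series description file.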
Startup would be a lot quicker if this information was collapsed down to a smaller number of larger XML files.
In my last post, I said I had trouble getting Rygel’s tracker backend to function and assumed that it was expecting an older version of the API. It turns out I was incorrect and the problem was due in part to Ubuntu specific changes to the Tracker package and the unusual way Rygel was trying to talk to Tracker.
The Tracker packages in Ubuntu remove the D-Bus service activation file for the “org.freedesktop.Tracker” bus name so that if the user has not chosen to run the service (or has killed it), it won’t be automatically activated. Unfortunately, instead of just calling a Tracker D-Bus method, Rygel was trying to manually activate Tracker via a StartServiceByName() call. This would fail even if Tracker was running, hence my assumption that it was a tracker API version problem.
This problem will be fixed in the next Rygel release: it will call a method on Tracker directly to see if it is available. With that problem out of the way, I was able to try out the backend. It was providing a lot more metadata to the PS3 so more files were playable, which was good. Browsing folders was also much quicker than the folder back end. There were a few problems though:
- Files are exposed in one of three folders: “All Images”, “All Music” or “All Videos”. With even a moderately sized music collection, this is unmanageable. It wasn’t clear what order the files were being displayed in, either.
- There was quite a long delay before video playback starts.
Once the folder back end’s metadata and speed issues are fixed, I’d be inclined to use it over the tracker back end.
Getting video transcoding working turned out to require a newer GStreamer (0.10.23), the “unstripped” ffmpeg libraries and the “bad” GStreamer plugins package from multiverse. With those installed, things worked pretty well. With these dependencies encoded in the packaging, it’d be pretty painless to get it set up. Certainly much easier than setting things up in MediaTomb’s configuration file.
I promised Zeeshan that I’d have a look at his Rygel UPnP Media Server a few months back, and finally got around to doing so. For anyone else who wants to give it a shot, I’ve put together some Ubuntu packages for Jaunty and Karmic in a PPA here:
Most of the packages there are just rebuilds or version updates of existing packages, but the Rygel ones were done from scratch. It is the first Debian package I’ve put together from scratch and it wasn’t as difficult as I thought it might be. The tips from the “Teach me packaging” workshop at the Canonical All Hands meeting last month were quite helpful.
After installing the package, you can configure it by running the “rygel-preferences” program. The first notebook page lets you configure the transcoding support, and the second page lets you configure the various media source plugins.
I wasn’t able to get the Tracker plugin working on my system, which I think is due to Rygel expecting the older Tracker D-Bus API. I was able to get the folder plugin working pretty easily though.
Once things were configured, I ran Rygel itself and an extra icon showed up on my PlayStation 3. Getting folder listings was quite slow, but apparently this is limited to the folder back end and is currently being worked on. It’s a shame I wasn’t able to test the more mature Tracker back end.
With LPCM transcoding enabled, I was able to successfully play a Vorbis file on the PS3. With transcoding disabled, I wasn’t able to play any music — even files in formats the PS3 could handle natively. This was apparently due to the folder backend not providing the necessary metadata. I didn’t have any luck with MPEG2 transcoding for video.
It looks like Rygel has promise, but is not yet at a stage where it could replace something like MediaTomb. The external D-Bus media source support looks particularly interesting. I look forward to trying out version 0.4 when it is released.
Last week, we released the source code to django-openid-auth. This is a small library that can add OpenID based authentication to Django applications. It has been used for a number of internal Canonical projects, including the sprint scheduler Scott wrote for the last Ubuntu Developer Summit, so it is possible you’ve already used the code.
Rather than trying to cover all possible use cases of OpenID, it focuses on providing OpenID Relying Party support to applications using Django’s django.contrib.auth authentication system. As such, it is usually enough to edit just two files in an existing application to enable OpenID login.
The library has a number of useful features:
- As well as the standard method of prompting the user for an identity URL, you can configure a fixed OpenID server URL. This is useful for deployments where OpenID is being used for single sign on, and you always want users to log in using a particular OpenID provider. Rather than being asked for their identity URL, users are sent directly to the provider.
- It can be configured to automatically create accounts when new identity URLs are seen.
- User names, full names and email addresses can be set on accounts based on data sent via the OpenID Simple Registration extension.
- Support for Launchpad’s Teams OpenID extension, which lets you query membership of Launchpad teams when authenticating against Launchpad’s OpenID provider. Team memberships are mapped to Django group membership.
While the code can be used for generic OpenID login, we’ve mostly been using it for single sign on. The hope is that it will help members of the Ubuntu and Launchpad communities reuse our authentication system in a secure fashion.
The source code can be downloaded using the following Bazaar command:
bzr branch lp:django-openid-auth
Documentation on how to integrate the library is available in the README.txt file. The library includes some code written by Simon Willison for django-openid, and uses the same licensing terms (2 clause BSD) as that project.
On my way back from Canada a few weeks ago, I picked up a SanDisk Sansa Fuze media player. Overall, I like it. It supports Vorbis and FLAC audio out of the box, has a decent amount of on board storage (8GB) and can be expanded with a MicroSDHC card. It does use a proprietary dock connector for data transfer and charging, but that’s about all I don’t like about it. The choice of accessories for this connector is underwhelming, so a standard mini-USB connector would have been preferable since I wouldn’t need as many cables.
The first thing I tried was to copy some music to the device using Rhythmbox. This appeared to work, but took longer than expected. When I tried to play the music, it was listed as having an unknown artist and album name. Looking at the player’s filesystem, the reason for this was obvious: Rhythmbox had transcoded the music to MP3 and lost the tags. Copying the ogg files directly worked a lot better: it was quicker and preserved the metadata.
Of course, getting Rhythmbox to do the right thing would be preferable to telling people not to use it. Rhythmbox depends on information about the device provided by HAL, so I had a look at the relevant FDI files. There was one section for Sansa Clip and Fuze players which didn’t list Vorbis support, and another section for “Sansa Clip version II”. The second section was a much better match for the capabilities of my device. As all Clip and Fuze devices support the extra formats when running the latest firmware, I merged the two sections (hal bug 20616, ubuntu bug 345249). With the updated FDI file in place, copying music with Rhythmbox worked as expected.
The one downside to this change is that if you have a device with old firmware, Rhythmbox will no longer transcode music to a format the device can play. There doesn’t seem to be any obvious way to tell if a device has a new enough firmware via USB IDs or similar, so I’m not sure how to handle it automatically. That said, it is pretty easy to upgrade the firmware following the instructions from their forum, so it is probably best to just do that.
Since it seems to be fashionable to blog about experiences with PulseAudio, I thought I’d join in.
I’ve actually had some good experiences with PulseAudio, seeing some tangible benefits over the ALSA setup I was using before. I’ve got a cheapish surround sound speaker set connected to my desktop. While it gives pretty good sound when all the speakers are used together, it sounds like crap if only the front left/right speakers are used.
ALSA supports multi-channel audio with the motherboard’s sound card alright, but apps producing stereo sound would only play out of the front two speakers. There are some howtos on the internet for setting up a separate ALSA device that routes stereo audio to all the speakers in the right way, but that requires that I know in advance what sort of audio an application is going to generate: something like Totem could produce mono, stereo or surround output depending on the file I want to play. This is more effort than I was usually willing to do, so I ended up flicking a switch on the amplifier to duplicate the front left/right channels to the rear.
With PulseAudio, I just had to edit the /etc/pulse/daemon.conf file and set default-sample-channels to 6, and it took care of converting mono and stereo output from apps to play on all the speakers while still letting apps producing surround output play as expected. This means I automatically get the best result without any special effort on my part.
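For reference, the change amounts to a single line in the daemon config (path as on my Ubuntu system):

```
; /etc/pulse/daemon.conf
default-sample-channels = 6
```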
I’m not too worried that I had to tell PulseAudio how many speakers I had, since it is possible to plug in a number of speaker configurations and I don’t think the card is capable of sensing what has been attached (the manual documents manually selecting the speaker configuration in the Windows driver). It might be nice if there was a way to configure this through the GUI though.
I’m looking forward to trying the “flat volume” feature in future versions of PulseAudio, as it should get the best quality out of the sound hardware (if I understand things right, 50% volume with current PulseAudio releases means you only get 15 bits of quantisation on a 16-bit sound card). I just hope that it manages to cope with the mixers my sound card exports: one two-channel mixer for the front speakers, one two-channel mixer for the rear two speakers and two single channel mixers for the center and LFE channels.
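The arithmetic behind that 15-bit figure, under the back-of-envelope assumption that 50% volume means a linear amplitude factor of 0.5 (which may not be exactly how PulseAudio maps its volume scale): attenuating digitally by a gain g before a fixed-width DAC discards log2(1/g) bits of resolution.

```python
import math

def effective_bits(sample_bits, linear_gain):
    """Effective quantisation depth left after scaling samples by a
    linear gain below 1.0 in software, before a fixed-width DAC.
    A simplified model, not a measurement of PulseAudio itself."""
    return sample_bits + math.log2(linear_gain)
```

Flat volumes avoid this loss by applying attenuation in the hardware mixer where possible, keeping the full sample width on the digital side.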
I’m in Montreal through to the end of next week. The sub-zero temperatures are quite a change from Perth, where it got up to 39°C on the day I left.
The last time I was here was for Ubuntu Below Zero, so it is interesting seeing the same city covered in snow.
Today was the first day of the mini-conferences that lead up to linux.conf.au later on this week. I arrived yesterday after an eventful flight from Perth.
I was originally meant to fly out to Melbourne on the red eye leaving on Friday at 11:35pm, but just before I checked in they announced that the flight had been delayed until 4:00am the following day. As I hadn’t had a chance to check in, I was able to get a pair of taxi vouchers to get home and back. I only got about 2 hours of sleep though, as they said they would turn off the baggage processing system at 3am. When I got back to the airport, I could see all the people who had stayed at the terminal spread out with airplane blankets. A little before the 4:00am deadline, another announcement was made saying the plane would now be leaving at 5:00am. Apparently they had needed to fly a replacement component in from over east to fix a problem found during maintenance. Still, it seems it wasn’t the most delayed Qantas flight for that weekend and it did arrive in one piece.
As I had planned to spend a day in Melbourne visiting relatives, it didn’t cause any problems with the flight on to Hobart. I had been invited to the “Ghosts” dinner, which was to start about an hour after my flight landed, so it was a bit of a rush to get to the university accommodation and then walk down the hill to the restaurant.
The dinner was pretty good, with organisers from all the previous LCA conferences plus the people organising the 2010 conference. Unfortunately, I was the only one from the 2003 organisers able to attend. It sounds like the 2010 organisers have things in hand, and the location should be great.