So after a very long one-year development cycle I finally managed to push a new release of Transmageddon. The main reason it took this long was that I decided to port to the brand new discoverer and encodebin elements, to greatly simplify the code and improve functionality. The thing was that when Edward Hervey wrote those elements he took the conscious decision to make them assume that all the GStreamer plugins they need behave as perfect GStreamer citizens. As you might imagine, since many of the plugins had never been tested for all such behaviour, a lot of things did not work after the port. But over the last months I have filed bug reports and most of them are now fixed. And with today's new python-gstreamer release (0.10.22) the binding bug for encodebin is fixed too, so I decided it was time to put out a new release.
I am quite happy with my new feature list:
Port to the new plugins-base discoverer and encodebin
Replace radiobutton lists with a combobox
Add support for audio-only transcoding
Add support for outputting audio only from video+audio files
Support container-free audio formats such as FLAC, MP3 and AAC
Add HTML5 and Nokia 900 profiles
Add support for video-only transcoding
Add support for MPEG-1 video and MPEG-2 audio
The new user interface should solve the problems people with small screens, like many netbook users, used to have with Transmageddon. Transmageddon now also automatically deinterlaces interlaced video clips. I want to improve this feature in a later release, making deinterlacing optional and also using Robert Swain's new elements that can help detect interlaced files that have been encoded as progressive files (as shown in his presentation in Prague).
Another major feature of this release is that audio-only files are now officially supported, and you can also use Transmageddon to easily output just the audio from an audio+video clip.
I also added a HTML5 webm profile, so output to that should be easier than ever. In fact I used that profile when I transcoded 100GB of video that we had recorded at an internal session at Collabora.
My next goal is to port to GStreamer 1.0 and GTK3. That said, if there turn out to be some brown paper bag issues in this release I will try to fix them and make a new release. My guess, though, is that most bugs people encounter will be due to issues that are so far only fixed in GStreamer git, so until those releases are out not everything will work 100% and there might be some small regressions from the previous release.
Anyway, I hope you head over to the Transmageddon website and grab the latest release for a test run. I will try to follow up on bug reports, but might be a bit slow the next week as I am flying down to Lahore to celebrate my sister-in-law's wedding and also see my beautiful wife again after almost a month apart.
Had an absolute blast and I am really happy the GStreamer Conference again turned out to be a big success. A big thanks to the platinum sponsor Collabora, and to the two silver sponsors Fluendo and Google, who made it all possible. Also a big thanks to Ubicast, who were there onsite recording all the talks. They aim to have all the talks online within a month.
While I had to run back and forth a bit to make sure things were running smoothly, I did get to see some very interesting talks, like Monty Montgomery from Xiph.org talking about the new Opus audio codec they are working on with the IETF and the strategies they are using to fend off bogus patent claims.
On a related note, I saw that Apple released their lossless audio codec ALAC as free software under the Apache license. Always nice to see such things, even if ALAC has for the most part failed to make any headway against the already free FLAC codec. If Apple would now join the effort around WebM, things would really start looking great in the codec space.
We ran a Collabora booth during the LinuxCon and Embedded Linux days that followed the GStreamer Conference. Our demos, showcasing an HTML5 video editing UI using GStreamer and the GStreamer Editing Services and video conferencing using Telepathy through HTML5, were a great success, and our big screen TV running the Media Explorer media center combined with Telepathy-based video conferencing provided us with a steady stream of people to our booth. For those who missed the conference, all the tech demos can be grabbed from this Prague-demo Ubuntu PPA.
So as you might imagine I was quite tired by the time Friday was almost done, but thanks to Tim Bird and Sony I got a really nice end to the week, as I won a Sony Tablet S through the eLinux wiki editing competition. The tablet is really nice and it was the first tablet I ever wanted, so winning one was really great. The feature set is really nice, with built-in DLNA support, the ability to function as a TV remote, and support for certain PlayStation 1 titles. The 'folded magazine' shape makes it really nice to hold and I am going to try to use it as an e-book reader as I fly off to Lahore tomorrow morning for my sister-in-law's wedding.
First of all, the first entry used the term liberals, which I would suggest avoiding: while in Europe it tends to mean people on the right wing of the political spectrum, because they are liberal in terms of economic policies, in North America a liberal is someone on the left side of the political spectrum, as they tend to be liberal in terms of social policies like gay marriage, abortion and so on. Due to this I try to refer to these groups as either social liberals or economic liberals (or libertarians if they are both for a free market and have socially liberal views).
Anyway, I think the original article by Josselin was quite silly in its tone. There are a lot of reasons we have the current set of problems in the world economy, and economically liberal policies carry only a small part of the blame. First of all, the problem in Europe is, as Mathias says in his rebuttal, that you have had mostly left-leaning governments budgeting year after year with huge deficits, covering them by pulling their countries deeper and deeper into debt. So when a crisis hits, and it always will under any system, they have nothing to fall back on as they are already in debt up to their necks. Personally I think Keynesian policies are probably what is needed in a situation like this, but that assumes the governments have made sure to have reserves, or at least low debt, before a crisis hits, so they can use those reserves to help pull the economy through. In Europe the problem was that the governments were already so deep in debt that when they tried to stimulate their economies they just made it clear how much money they actually owed and how unlikely they were to ever get back on track again.
The US did something even more silly, they cut taxes and increased expenses at the same time. This idiot policy was called Reaganomics. And while it was definitively right wing in origin, I am not sure it can be easily placed in a libertarian versus socialist context. No matter if you are left wing or right wing, the most basic rule is that you keep your books in order. That means that if you want to increase spending you need to also increase your income, and if you want to decrease taxes and thus your income, you need to decrease your spending. The left wing governments of Europe only increased spending, but not income, and the right wing governments of the US only cut income, but not spending.
As for the depressed wages around the western world they are caused by a lot of different things, but the biggest of them all is that new technology combined with free trade agreements have undermined our ability to keep the rest of the world down in poverty and unable to develop their own industries. I assume we all accept this as a good thing in principle, but yes it does cause transitional pains and wage pressures here while we wait for the cost levels in the BRIC countries to catch up with us.
As for deregulation of the banking sector, which can be said to be an economically liberal policy, being part of what caused the recent upheaval: yes, it probably did play a part. But it does strike me as quite hypocritical that when the political drive to get people into the housing market, which was supported everywhere by people on both the left and the right, ends up getting banks in trouble due to the so-called sub-prime loans collapsing, it is all about the banks and not about the politicians who supported such lending habits to further their own social policies. Subprime loans are not something impossible to understand; they are basically loans to poorer people. And politicians on both sides of the aisle felt that encouraging home ownership was a great way to help raise such people into the middle class. And it worked quite well for a while in terms of improving the standard of living for a lot of people, but as we all know now, it was a bubble that had to burst at some point. Anyway, lending money to poorer people who might not be able to afford it, so they can buy their own house, doesn't seem to be a clear fit under the headline 'liberal economic policies' to me.
I think most people agree that an ideologically pure system of any type is not likely to work or be deployable. In my view, the problem these days is that the system has moved from one leaning towards a free market economy, supporting entrepreneurship and healthy competition, to one of protecting established players. So we have policies, through things like the tax system, government aid and the patent system, that in most western countries function in a way that protects established companies and disadvantages the small companies trying to compete with them. I would rather define that as government-led capitalism than free market liberalism.
So it doesn’t matter if we are talking about North America or Europe here: most times when politicians talk about new policies to help the economy or stimulate job growth, they are policies to support big established companies or groups. So every year these companies and industries grow more bureaucratic and less efficient, while more innovative and nimble startups face an uphill struggle against the establishment.
So while I readily admit that there are a million other factors in play and that the root causes vary somewhat from country to country, let's avoid silly name calling and pretending that populist policies are what is needed to improve the situation. Because if there is one 'ism to blame for where we are, it is not liberalism or socialism, it is populism. The populism-supporting five-minute news cycle means politics is about 10-second snazzy slogans and handsome faces, not in-depth discussion or detailed review.
I am really excited about this year's GStreamer Conference, as we have a lot of ongoing efforts about to come to fruition. From Collabora we have Wim Taymans, who will be talking about the GStreamer 1.0 effort, which we expect to have out before year's end, and Tim-Philipp Müller, who will speak about a lot of the other incredible advances we made over the last year. Being in the middle of it, I think it's easy to go a bit blind to the gradual progress, but there are things like the new parsing libraries Thibault Saunier has been working on, which will enable much quicker and better support for things like libva and vdpau plugins in GStreamer. Or the new base classes Mark Nauwelaerts has ported most of our plugins over to, which in one fell swoop improved our plugin quality by leaps and bounds. And of course there are things like the GStreamer Editing Services (GES), discoverer and encodebin, which Edward Hervey created, and which will make applications like the Transmageddon video transcoder and remuxer and the PiTiVi video editor a lot easier to develop.
We will also be doing some really cool demonstrations of things we have been working on at Collabora at the LinuxCon Showcase on Thursday. Thanks to GES we have a great demo of a mobile editing solution using either QML or HTML5. We have HTML5 video calling using Telepathy, and we have video calling using Telepathy from the Media Explorer media center solution.
Another talk I will be sure not to miss is Jan Schmidt's on Blu-ray playback with GStreamer. In addition to being technically interesting, Jan's talks are always fun; last year he did his presentation using GStreamer instead of something like LibreOffice, having created his slides as a DVD menu through a small program he wrote to turn SVG files into DVD menus.
I have wanted to write about programming with GStreamer and Python for a while. Jono Bacon wrote a nice introduction to GStreamer and Python a long time ago, but I want to share with you some specific tips.
At Collabora we work a lot with GStreamer, including helping train developers at our customers to be better at GStreamer development. Being the lowly marketing guy at the company I don't have the programming chops to teach the hard stuff, but I figured I should be able to put together a very simple article which explains some basics and shows off a little GStreamer development trick I have used to great success in Transmageddon.
Part of what triggered getting this little tutorial done was that I am looking into porting Transmageddon to GTK3 after its next release. To understand how to write a GTK3 Python application using the introspection bindings, I decided a good learning exercise would be to try to port the 0.0.1 version of Transmageddon. That version was never released; in fact it was me trying to figure out the very basics of programming with GTK+ and GStreamer in Python.
The application literally consists of a GTK+ UI with two buttons. One is a ‘transcode’ button which, when pressed, starts a GStreamer transcoding pipeline. The other is my little secret trick, called ‘Debug’. When pressed it generates a png of the pipeline being run, or not being run for that matter. It has helped me solve a ton of bugs and issues in Transmageddon since I started the project, and hopefully it can be a useful trick for you too.
You can find a tarball here with the code below, the .ui file from Glade and a which.py file (which.py is a python version of the Unix which tool, which I found online).
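In case you are curious what which.py does, the idea can be approximated in a few lines of pure Python. This is just a sketch of the search-the-PATH logic, not the exact file from the tarball:

```python
import os

def which(program):
    # Search each directory in PATH for an executable file with the given
    # name and return its full path, or None if it is not found
    for directory in os.environ.get("PATH", "").split(os.pathsep):
        candidate = os.path.join(directory, program)
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            return candidate
    return None
```

The application below only relies on it returning a path when the binary exists and something false-y when it does not.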
First let me give you the code of the application, I tried to annotate the code in detail to make it easy to follow, even if you haven’t played with either GTK3 or GStreamer before.
#!/usr/bin/env python
# Simple example GTK3 + GStreamer 0.10.x application for transcoding
# GTK3 using gobject introspection for bindings, GStreamer using manual bindings
# Also includes how to set up dotfile generation
import os

# Setting the GST_DEBUG_DUMP_DOT_DIR environment variable enables us to have
# a dotfile generated; it must be set before GStreamer is imported
os.environ["GST_DEBUG_DUMP_DOT_DIR"] = "/tmp"

import pygst
pygst.require("0.10")
import gst
from gi.repository import Gtk

import which

# creating a basic transcoder class
class Transcoder:
    def __init__(self):
        self.pipeline = gst.Pipeline("TranscodingPipeline") # creating overall pipeline object

        # Creating GStreamer filesrc element and setting it to read a specific mp3 file
        self.filesrc = gst.element_factory_make("filesrc", "filesrc")
        self.filesrc.set_property("location", "test.mp3") # point this at a file of your own
        self.pipeline.add(self.filesrc) # add this first plugin to the pipeline object

        # Use highlevel decodebin2 element to choose which GStreamer elements to use
        # for decoding automatically
        self.decoder = gst.element_factory_make("decodebin2", "decoder")
        # Connect to signal that will let us know that decodebin2 got a pad we can
        # connect to which has the decoded media file on it
        self.decoder.connect("new-decoded-pad", self.OnDynamicPad)
        self.pipeline.add(self.decoder)

        # create an audioconvert element to convert the audio format if needed
        self.audioconverter = gst.element_factory_make("audioconvert", "audioconverter")
        self.pipeline.add(self.audioconverter)

        # create audioencoder, in this case the Vorbis encoder
        self.audioencoder = gst.element_factory_make("vorbisenc", "audioencoder")
        self.pipeline.add(self.audioencoder)

        # create ogg muxer to hold vorbis audio
        self.oggmuxer = gst.element_factory_make("oggmux", "oggmuxer")
        self.pipeline.add(self.oggmuxer)

        # create file output element to write new file to disk
        self.filesink = gst.element_factory_make("filesink", "filesink")
        self.filesink.set_property("location", "/tmp/transcoded.ogg")
        self.pipeline.add(self.filesink)

        # Now that all elements for the pipeline are created we link them together
        self.filesrc.link(self.decoder)
        self.audioconverter.link(self.audioencoder)
        self.audioencoder.link(self.oggmuxer)
        self.oggmuxer.link(self.filesink)

        # set pipeline to playing, which means all the connected elements in the
        # pipeline start pushing data to each other
        self.pipeline.set_state(gst.STATE_PLAYING)

    # a simple function that is run when decodebin2 gives us the signal to let us
    # know it got audio data for us. Use the get_pad call on the previously
    # created audioconverter element asking for a "sink" pad.
    def OnDynamicPad(self, dbin, pad, islast):
        pad.link(self.audioconverter.get_pad("sink"))

# extremely simple UI using a GtkBuilder UI generated with Glade, just two buttons.
# One to start transcode and one to run pipeline debug
class SuperSimpleUI:
    def __init__(self):
        self.builder = Gtk.Builder()
        self.uifile = "supersimple-gtk3.ui"
        self.builder.add_from_file(self.uifile)
        self.window = self.builder.get_object("MainWindow")
        self.window.connect("destroy", self.dialog_destroyed) # this allows the
                                                              # application to be cleanly killed
        # Get the two buttons from the UI
        self.transcodebutton = self.builder.get_object("transcodebutton")
        self.debugbutton = self.builder.get_object("debugbutton")
        # Connect to the clicked signal on both buttons
        self.transcodebutton.connect("clicked", self.on_TranscodeButton_clicked)
        self.debugbutton.connect("clicked", self.on_debug_activate)
        # set window size to avoid it being so small it gets lost on the desktop
        self.window.set_default_size(580, 435)
        self.window.show_all()

    def on_TranscodeButton_clicked(self, widget):
        self._transcoder = Transcoder()

    def dialog_destroyed(self, dialog):
        Gtk.main_quit()

    # this function generates the dot file, checks that graphviz is installed and
    # then finally generates a png file, which it then displays
    def on_debug_activate(self, widget):
        dotfile = "/tmp/supersimple-debug-graph.dot"
        pngfile = "/tmp/supersimple-pipeline.png"
        if os.access(dotfile, os.F_OK):
            os.remove(dotfile)
        if os.access(pngfile, os.F_OK):
            os.remove(pngfile)
        gst.DEBUG_BIN_TO_DOT_FILE(self._transcoder.pipeline, \
            gst.DEBUG_GRAPH_SHOW_ALL, "supersimple-debug-graph")
        # check if graphviz is installed with a simple test
        dot = which.which("dot")
        if dot:
            os.system(dot + " -Tpng -o " + pngfile + " " + dotfile)
            Gtk.show_uri(None, "file://" + pngfile, 0)
        else:
            print "The debug feature requires graphviz (dot) to be installed."
            print "Transmageddon can not find the (dot) binary."

if __name__ == "__main__":
    hwg = SuperSimpleUI()
    Gtk.main()
The first thing that happens in the file, after importing the basic system classes and the which.py tool, is that we set the ‘GST_DEBUG_DUMP_DOT_DIR’ environment variable. When you set this value, GStreamer will be able to dump the pipeline and its elements at any time to a ‘dot’ file, which can be turned into a nice looking png by the graphviz command line tool (which should be available in most distributions).
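The dot-to-png step the debug button performs boils down to one graphviz invocation. As a standalone illustration (the file names are just the examples used above), the command it runs can be assembled like this:

```python
def build_dot_command(dotfile, pngfile, dot_binary="dot"):
    # Assemble the graphviz invocation that renders a GStreamer pipeline
    # dump (.dot) into a viewable png image
    return [dot_binary, "-Tpng", "-o", pngfile, dotfile]

# The same invocation the debug button performs via os.system
cmd = build_dot_command("/tmp/supersimple-debug-graph.dot",
                        "/tmp/supersimple-pipeline.png")
```

Building the argument list separately like this also makes it easy to pass to subprocess instead of os.system if you prefer.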
Next I import GTK and GStreamer, as you see I don’t yet use the gobject introspection version of GStreamer as that is not fully working yet, but I plan to try to port this simple application to GStreamer 1.0, in which gobject introspection will be the supported way of using Python.
Next is setting up the GStreamer pipeline. You always start by creating a pipeline object; consider this the canvas onto which you will paint the GStreamer streaming pipelines. The next step is to assemble all the GStreamer plugins we want to use in the application. First I create a filesrc object pointing to the file I want to transcode; be sure to point that to a file of your own if trying this application. Next is creating the decodebin2 element. Decodebin2 is one of a set of high level elements in GStreamer, called bins, which contain a wide range of plugins inside. These high level elements are there to make things a lot simpler, and in the case of decodebin2 it will automatically put together the plugins needed to convert your incoming file to raw audio and video (or just demux the file). This means your input doesn’t need to be an mp3 file, like I used, as decodebin2 will reconfigure itself to handle any file you throw at it. After this I create a series of elements to enable me to encode the data into an Ogg Vorbis file. I am doing that to help explain how elements are strung together, but there is another high level element, encodebin, which I could have used instead. Transmageddon uses encodebin in its git version.
Once all the elements are created you can think of them as boxes spread around on your pipeline canvas, but in order for GStreamer to know how you want to connect them together you need to link them, as you can see I do with statements like ‘self.filesrc.link(self.decoder)’, which connects the filesrc element I created with the decoder element.
The one special element here is decodebin2, which, being a dynamic element, I can only link once its pad-found signal fires. To do the linking I also need to request a compatible pad from the element I am linking with, in this case the audioconverter element.
The last part of the GStreamer setup is setting the pipeline to the playing state, which is the state where the pipeline is running. While not a big concern in this very simple application, dealing with state changes in GStreamer is going to be one of the major items you look out for. The GStreamer plugin writer's guide contains a chapter discussing the basics of the four states: "NULL", "READY", "PAUSED" and "PLAYING". Your pipeline (and all elements) always starts in the NULL state and will go through each of the other stages to reach PLAYING. So while we only set the state to PLAYING in this simple application, GStreamer will in the background go through READY and PAUSED. The reason the intermediary states matter is that certain things happen at each of them. For instance, if you want to do some analysis of a file before starting to run your pipeline fully, you want to be in the PAUSED state, as GStreamer will then start pulling the initial data through the pipeline and thus allow you to get information from your elements about the stream or file. One important thing to keep in mind as you develop more advanced applications is that individual elements can have a different state than the pipeline, but when the state of the pipeline changes it will change the state of the plugins along with it. So you never want your pipeline to be more than one level lower than any of your elements, as that will cause the element to jump down to that state and thus lose the negotiation and information it had assembled.
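To make the state progression concrete, here is a tiny pure-Python model of the state ladder. This is not GStreamer API, just an illustration of which states a set_state() request implicitly steps through:

```python
# Illustrative model of GStreamer's state ladder; GStreamer itself handles
# this internally when you call set_state() on a pipeline or element
STATES = ["NULL", "READY", "PAUSED", "PLAYING"]

def states_traversed(current, target):
    # Return the list of states stepped through (including the target)
    # when moving from the current state to the target state
    i, j = STATES.index(current), STATES.index(target)
    if i < j:
        return STATES[i + 1:j + 1]
    return list(reversed(STATES[j:i]))
```

So a jump straight from NULL to PLAYING still passes through READY and PAUSED, which is why things like stream analysis can hook in at the PAUSED step.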
I am not going to go into a lot of detail about the GUI; it is a very simple GTK user interface built using Glade and hooked up using the GTK3 gobject introspection bindings. If you have any questions about it, post a comment and I'd be happy to talk about it. What I want to talk about instead is the on_debug_activate function. I wrote this for Transmageddon, but my hope is that it will be useful for anyone writing a Python application with GStreamer (and I guess it shouldn't be too hard to port to another language). It will allow you to add a menu entry or button in your application that outputs a png file, like the one you see below, which gives you a nice full view of the pipeline used by GStreamer. Especially if you use things like decodebin2 and encodebin, or have a lot of code dynamically adding/removing elements, it can be really useful to see what pipeline ended up being used. And if you have elements that you created but forgot to link in, they will appear as orphaned boxes in the file, allowing you to detect such issues. The important thing to remember is that it needs the graphviz application to be installed on your system and available in the executable path.
Anyway, I hope this has been useful, and I plan to post an updated version of this simple application, ported to use encodebin and GStreamer 1.0.