The endless ‘Linux ready for the desktop’ discussion

Slashdot has a story from a guy giving up on using Linux on the desktop. Looking into it, what he is actually giving up on is having a ‘rogue’ Linux system seamlessly integrate into a Microsoft environment. The community is getting better all the time at providing technologies that will emulate your Windows software of choice, but unfortunately Microsoft is not standing still, so it is chasing a moving target. So unless you have an IT organisation which is interested in truly supporting Linux, you will probably always have some pains. On the other hand I do believe that for a lot of corporate environments Linux is ready for their desktop, as long as they actually want Linux on their desktops, meaning they tune their purchasing strategies and software choices accordingly.

That said, I have myself been thinking about the state of the desktop. I too have been part of this community for a long time now, and there always seems to be a new big hurdle to cross before we reach desktop nirvana. Sometimes it can be a bit discouraging. But if I look back to when I started out with Linux and compare the problems then to the problems we face today, I realize that we have gotten a lot closer to being where we want to be. A lot of applications simply didn't exist on Linux when I started out. In fact the only two advanced desktop applications I can remember being at a level where they fulfilled my needs were Netscape and Gimp. Today there are a host of applications in most major categories. Applications that come to mind are Inkscape, OpenOffice, Evolution, Gaim, Ekiga, Totem and Rhythmbox.
For instance I remember getting my friends to switch to Yahoo Messenger, as it was the only IM service I managed to get working under Linux at the time.

Hardware support has gotten a lot better. When I first installed Linux on my laptop back then, I spent a full weekend getting basic audio working on it by grabbing CVS versions of ALSA and trying to figure out magic options in the alsarc file. It also seemed like the only time hardware got detected under Linux at that point was during install, so when I changed hardware I tended to end up re-installing Linux on the machine. That problem could have been avoided had I understood more of how Linux worked back then, but today it's not an issue as installing new hardware is rather automated. I also remember spending a lot of time trying to get Linux to be able to read the ‘Joliet’ CDs I burned under Windows. Linux at the time only supported a different standard, one which I only ever saw a single burning application support under Windows, and that one was silly expensive.

CD burning under Linux only existed in the form of the command line cdrecord application, which took me quite some time to figure out. Today my burning needs are mostly taken care of by Nautilus.
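
To give an idea of what ‘figuring out’ meant back then: you first had to find your burner's SCSI triple with cdrecord -scanbus and then feed it to an invocation along these lines (the device numbers and speed here are purely illustrative):

cdrecord -v dev=0,0,0 speed=4 -data image.iso

Compare that to dragging files into a Nautilus window today.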

Hardware support today, as back then, is still an issue. But it is much less of an issue, and hardware that is supported tends to be so in a much more ‘real’ way. Back when I started, ‘supported’ meant ‘you can get it to work if you spend a few days on it’; today ‘supported’ means it will just plug and play.

We have succeeded in creating standardized abstractions or subsystems for most things today. So when someone writes a new driver, your application doesn't need to specifically support it anymore; as long as the driver is written towards the shared interface it will just work. I am sure there are still some subsystems which are not perfect yet in this regard, but let's face it, we are currently working on fixing usability issues more often than we are looking into the ‘how can we get this type of thing to work at all’ kind of issues.

It's like libgimme-codec. We are not trying to solve the problem of ‘how do we get any kind of support for media playback’ anymore; instead we are trying to solve the problem of ‘how can we do this in the most user-friendly way possible, given the constraints we are facing’.

We might not be there 100% yet, but looking a few years back in time does make it clear we are moving forward rapidly. And I don't think we need that many more major announcements like the recent Novell Peugeot Citroën deal before the hardware makers all realize that the Linux desktop is here to stay and needs to be supported properly.

Article with some quotes from me

Nathan Willis recently interviewed me about the codec shop launch Fluendo did. The result is on linux.com now. It ended up being less of an interview and more of an article than I expected, but I guess it still explains a few things around our shop launch.

A cool video blogging idea

A bunch of us from Fluendo were out yesterday at a local bar with Internet access. As such things go, we started playing a bit with our Nokia 800s and the video conferencing feature. What suddenly struck me is that it would be great to have a small application which connects to a GoogleTalk account and just listens for incoming calls. You should then be able to call this account, and the application will simply accept the incoming call and record it to disk, muxing it into a container format and maybe transcoding it to Theora/Vorbis. That way you have an easy way to record little video blogs as you travel around and store them on your home computer. The next step then of course would be some kind of YouTube extension that lets you upload the video clips automatically, or something like that :)
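
The recording leg of that should be pretty straightforward. Ignoring the call-handling side of it, transcoding whatever comes in into Theora/Vorbis inside an Ogg container is basically just a pipeline along these lines (the input filename is of course only a stand-in for however the incoming call ends up being captured):

gst-launch-0.10 filesrc location=incoming-call.avi ! decodebin2 name="decode" decode. ! ffmpegcolorspace ! theoraenc ! queue ! oggmux name=mux ! filesink location=/tmp/videoblog.ogg decode. ! audioconvert ! vorbisenc ! queue ! mux.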

I just proposed the application to the Make it with Mono competition that Miguel blogged about recently. Hopefully it will make it through the moderation so I can try plugging it when the voting starts in April. Should be fairly easy to build this on top of GStreamer and Telepathy (which are also used in the Nokia 800).

The lagging fortunes of XaraLX

About a year ago the source code and Linux port of XaraLX was announced to much fanfare and excitement. Unfortunately things haven't developed that well as time has gone by. First of all, Xara has had limited resources to devote to the project themselves, and little in the way of a developer community has formed on the outside. What I learned to my surprise today is that part of the reason for this is that Xara depends on a binary blob called cdraw/gdraw, which plays a role similar to pixman in Cairo. This, among other things, keeps most distros from shipping it. Not sure what will happen with XaraLX now, but one hope would be that someone sits down and ports the XaraLX code from cdraw/gdraw to Cairo, and through that makes it a truly free software application. Xara is interesting and should have a chance to fulfill a need alongside Gimp and Inkscape.

Nokia 800

A little later than the rest of the herd, I got my Nokia 800 today. It is a worthy upgrade to the 770. Especially with Tigert's Plankton theme it looks and feels very snazzy.

My main frustration from the 770 is still there though: when adding new streams to the internet radio I have to have the actual stream URI; the playlist URI, which is the easiest and most common one to find, is not supported. That said, finding the place to enter such URIs is much easier now compared to the 770.
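
For what it is worth, the actual stream URI can usually be fished out of a .pls playlist by hand with something like the following (the URL is obviously just a placeholder):

curl -s http://example.com/station.pls | grep -i '^File1=' | cut -d= -f2-

Not exactly user friendly, but it gets the stream onto the device.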

One feature we look to get into Pitivi is Nokia 770/800 output profiles. So if you want to prepare a video to take on a trip, you just choose the Nokia 770/800 profile and it will get transcoded into a video using the optimal combination of codecs and image size/framerate for playback on your device. New release of Pitivi out today btw, so be sure to check it out.
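
Until such a profile exists, the kind of pipeline it would boil down to probably looks something like this; the codec, image size and container choices here are just my guesses at what plays back well on the device, not anything official from Pitivi:

gst-launch-0.10 filesrc location=holiday.avi ! decodebin2 name="decode" decode. ! ffmpegcolorspace ! videoscale ! "video/x-raw-yuv, width=(int)400, height=(int)240" ! ffenc_mpeg4 ! queue ! avimux name=mux ! filesink location=/tmp/holiday-n800.avi decode. ! audioconvert ! audioresample ! lame ! queue ! mux.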

Miguel on ODF vs OOXML

Miguel has a very interesting blog entry discussing the ongoing ODF vs OOXML ISO debacle. Since he linked to an email I sent not long ago regarding SVG, I felt a bit compelled to comment on that part of his essay. I agree that it is weird to attack OOXML for its lack of SVG use when OpenOffice doesn't support it either, but instead uses its own OOD format. On the other hand, just because ODF fucked up in this area doesn't mean OOXML needs to repeat the stupidity. Of course, if they do end up using SVG it would be a bit funny, as suddenly, for graphics at least, OOXML would be a better and freer standard than ODF is :)
Miguel also points out the size of the SVG spec as a problem, which I agree with, but the solution I have advocated for a long while within the librsvg community is to aim for supporting the SVG Mobile profile, as it is for the most part the sensible subset of SVG we are all looking for.

Speaking of librsvg, it is in maintenance mode currently. Caleb, who pushed many of the major changes for a long while, has unfortunately gone AWOL, and Carl Worth is naturally putting most of his energy into Cairo itself. Dom is still around maintaining and holding the fort, but doesn't have the capacity to take librsvg the last steps to match the SVG Tiny profile. So if anyone out there is interested in joining the librsvg team to flesh out the remaining holes in librsvg and actually conform fully with one of the W3C SVG specs, please drop by #librsvg on Gimpnet or join the mailing list.

Dirac encoder/decoder coming along nicely

So we have been working on the Schroedinger Dirac implementation for some time and it is really starting to come together now.
The decoder is pretty fast and works well, and the encoder is getting quite close too, although its default settings still need to be moved away from developer settings. I am not exactly sure where we stand in terms of being compliant with the latest Dirac specification, but we should be quite close, as most of Dave's recent commits have been about taking us the last few steps towards compliance. Anyway, here is a screenshot I took today showing a video I created using this pipeline:


gst-launch-0.10 filesrc location=Dolphins_1080.wmv ! decodebin2 name="decode" \
  decode. ! ffmpegcolorspace ! videoscale method=1 qos=false ! "video/x-raw-yuv, width=(int)640, height=(int)480" ! \
  ffmpegcolorspace ! schroenc ! queue ! oggmux name=mux ! gnomevfssink location=file:///tmp/dolphins.ogg \
  decode. ! audioconvert ! vorbisenc ! queue ! mux.

Two things stand out in this screenshot: one is that we need the Dirac plugins to report their codec and bitrate to the GUI :) The second is that the encoder needs tweaking so we don't get the blur shadow at the ‘borders’ between different items in the image.
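
For the curious, a quick way to sanity check such a file afterwards is simply to play it back through playbin (assuming the Schroedinger decoder plugin is installed):

gst-launch-0.10 playbin uri=file:///tmp/dolphins.ogg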

Fun little odd game from Oddlabs

So from time to time I head over to LinuxGames.com to see the latest news and keep track of how Linux as a gaming platform is evolving. Gamers have been one of those early adopter segments I have been hoping we would be able to lure to the Linux platform at some point, but of course currently it's mostly about wondering whether the egg or the chicken will be the start of Linux as a competitive gaming platform.

Last week I found the game Tribal Trouble from Oddlabs.com, a small Danish gaming company. It's a 3D real time strategy game available for Windows, Mac and Linux, with the multiplatform support made possible by the game being written in Java. It was an enlightening experience for me for a variety of reasons, one being that it is possible to write a 3D game like this in Java and get good performance out of it. It is not the first Java game I have seen recently, and I get the impression that there are quite a few of these Java based games out there, which thus have a very low threshold for supporting Linux. Puzzle Pirates is another of this new generation of games written in Java. With Sun's recent decision to GPL their implementation of Java I think we have a great opportunity to integrate Java closely into the desktop to enable easy playing of games like these. Sun's great work on integrating look-and-feel wise with GTK+ is of course another great boon. One thing I did find in the Oddlabs development blog was a mention that their paying customers were 47% Mac, 9% Linux and 44% Windows. Come on everyone, there have to be more people out there using Linux interested enough in getting fun little games onto our favourite platform; let's at least try to match the market for Mac software. Personally I have already bought the game and spent quite a few hours playing it :)

Tried eating an OLPC laptop?

So we have one of those cute little green OLPC laptops here at the Fluendo office. What suddenly struck me today is how much it looks like a children’s toy, which is appropriate considering who it is targeted at.
But I am sure things like Fisher-Price toys go through a lot of child safety testing to make sure they are, for instance, not poisonous. So the question is, has anyone tried eating parts of their OLPC to make sure we don't risk killing any kids somewhere with it? Or do I need to pick an office volunteer to try eating some OLPC to make sure it's truly safe for the world's children?

Interviewed on The Linux Link Tech Show

Ok, so I was interviewed about our recent plugin shop on The Linux Link Tech Show. So if you are interested in hearing what Dan and Patrick managed to lure out of me, go to their page and download Episode 176. Topics include our new codecs, free codecs, DRM and other things we do at Fluendo. The sound quality is not great, especially on my part, since I was recorded over my cell phone and the output of that was transcoded at least once if not twice. But hopefully I am intelligible :)

Embedded Linux Conference in Brazil?

Thought I should plug the great work done to organize the Bossa conference in Brazil in March. Bossa will bring together a lot of people from the embedded and mobile Linux development community, including GStreamer and Fluendo's own Wim Taymans. Other participants include Robert McQueen and Philippe Khalaf talking about Farsight and Telepathy, Marcel Holtmann on BlueZ/Bluetooth, Chris Hofmann on Minimo and many more.

So be sure to sign up if you have an interest in this field. For people outside South America, maybe combining Bossa with some days of vacation would be the perfect opportunity/excuse for visiting Brazil :)

GStreamer on the server side

At Fluendo we have been using GStreamer as the engine for the Flumotion streaming service for quite a while now. But it is nice to see that other companies using GStreamer on the server are starting to make their mark too. Seeing this article on news.com about Snocap making a deal to allow artists to sell music through MySpace reminded me that Snocap is also using GStreamer for their system. I don't have any details, but the fact that all their job ads mention the need for, or advantage of, GStreamer experience is a clear sign :). For those that don't know Snocap, it's the latest venture of Shawn Fanning, the creator of Napster.

Anyway, if anyone reading this is using GStreamer to power their websites or services, I would love it if you posted a comment about it. It's always nice to hear how people use GStreamer to solve practical challenges.
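
To give a flavour of what server-side GStreamer can look like in its very simplest form (a toy sketch, nothing like what Flumotion or Snocap actually run), you can serve an Ogg/Vorbis stream straight from a gst-launch line and point clients at the port:

gst-launch-0.10 audiotestsrc ! audioconvert ! vorbisenc ! oggmux ! tcpserversink host=0.0.0.0 port=8000

A real service of course needs to handle many clients, burst-on-connect and so on, which is where multifdsink and a framework like Flumotion come into the picture.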