Entries Tagged 'GNOME'

The lagging fortunes of XaraLX

About a year ago the source code and Linux port of XaraLX was announced to much fanfare and excitement. Unfortunately, things haven't developed that well as time has gone by. First of all, Xara has had limited resources to devote to the project themselves, and little in the way of a developer community has formed on the outside. What I learned to my surprise today is that part of the reason for this is that Xara depends on a binary blob called cdraw/gdraw, which plays a role similar to pixman in Cairo. This, among other things, keeps most distros from shipping it. Not sure what will happen with XaraLX now, but one hope would be that someone sat down and ported the XaraLX code from cdraw/gdraw to Cairo, and through that made it a truly free software application. Xara is interesting and should have a chance to fulfill a need alongside Gimp and Inkscape.

Nokia 800

A little later than the rest of the herd I got my Nokia 800 today. It is a worthy upgrade to the 770. Especially with Tigert's Plankton theme it looks and feels very snazzy.

My main frustration from the 770 is still there though: when entering new streams for the internet radio I have to have the actual stream URI; the playlist URI, which is the easiest and most common one to find, is not supported. That said, finding the place to enter such URIs is much easier now compared to the 770.

One feature we look to get into Pitivi is Nokia 770/800 output profiles. So if you want to prepare a video to take on a trip, you just choose the Nokia 770/800 profile and it will get transcoded into a video using the optimal combination of codecs and image size/framerate for playback on your device. A new release of Pitivi is out today btw, so be sure to check it out.
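To make the idea concrete, here is a minimal sketch in Python (Pitivi's implementation language) of what such a device profile could look like. Everything here is hypothetical and not Pitivi's actual API; the helper just shows the kind of aspect-ratio-preserving size calculation a device profile implies.

```python
# Hypothetical device profile; field names and values are illustrative only.
NOKIA_800_PROFILE = {
    "container": "avi",
    "video_codec": "mpeg4",
    "audio_codec": "mp3",
    "max_width": 400,    # assumed playback size, not an official spec
    "max_height": 240,
    "framerate": 15,
}

def fit_to_profile(src_width, src_height, profile):
    """Scale source dimensions to fit inside the profile's maximum size,
    keeping the aspect ratio and never upscaling. Results are rounded
    down to even numbers, as many codecs require."""
    scale = min(profile["max_width"] / float(src_width),
                profile["max_height"] / float(src_height),
                1.0)
    width = int(src_width * scale) // 2 * 2
    height = int(src_height * scale) // 2 * 2
    return width, height
```

So a 640x480 source would come out as 320x240 for this profile, while a clip already smaller than the screen would be left at its original size.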

Miguel on ODF vs OOXML

Miguel has a very interesting blog entry discussing the ongoing ODF vs OOXML ISO debacle. Since he linked to an email I sent not long ago regarding SVG, I felt a bit compelled to comment on that part of his essay. I agree that it is weird to attack OOXML for its lack of SVG use when OpenOffice doesn't support it either, but instead uses its own OOD format. On the other side, just because ODF fucked up in this area doesn't mean OOXML needs to repeat the stupidity. Of course, if they do end up using SVG it would be a bit funny, as suddenly, for graphics at least, OOXML would be a better and freer standard than ODF is :)
Miguel also points out the size of the SVG spec as a problem, to which I agree, but the solution I have advocated for a long while within the librsvg community is to aim to support the SVG Mobile profile, as it is for the most part the sensible subset of SVG we are all looking for. Speaking of librsvg, it is in maintenance mode currently. Caleb, who pushed many of the major changes for a long while, has unfortunately gone AWOL, and Carl Worth is naturally putting most of his energy into Cairo itself. Dom is still around maintaining and holding the fort, but lacks the time to take librsvg the last steps to match the SVG Tiny profile. So if anyone out there is interested in joining the librsvg team to flesh out the remaining holes in librsvg and actually conform fully with one of the W3C SVG specs, then please drop by #librsvg on Gimpnet or join the mailing list.

Fun little odd game from Oddlabs

So from time to time I head over to LinuxGames.com to see the latest news and keep track of how Linux as a gaming platform is evolving. Gamers have been one of those early adopter segments I have been hoping we would be able to lure to the Linux platform at some point, but of course currently it's mostly about wondering whether the egg or the chicken will come first for Linux as a competitive gaming platform.

Last week I found the game Tribal Trouble from Oddlabs.com, a small Danish gaming company. It's a 3D real-time strategy game available for Windows, Mac and Linux, with the multiplatform support enabled by the game being written in Java. It was an enlightening experience for me for a variety of reasons, one being that it is possible to write a 3D game like this in Java and get good performance out of it. It is not the first Java game I have seen recently, and I do get the impression that there are quite a few of these Java based games out there, which thus have a very low threshold for supporting Linux. Puzzle Pirates is another of this new generation of games written in Java. With Sun's recent decision to GPL their implementation of Java, I think we have a great opportunity to integrate Java closely into the desktop to enable easy playing of games like these. Sun's great work on integrating look-and-feel wise with GTK+ is of course another great boon. One thing I did find in the Oddlabs development blog was a mention that their paying customers were 47% Mac, 9% Linux and 44% Windows. Come on everyone, there have to be more people out there using Linux interested enough in getting fun little games onto our favourite platform. Let's at least try to match the market for Mac software. Personally I have already bought the game and spent quite a few hours playing it :)

Tried eating an OLPC laptop?

So we have one of those cute little green OLPC laptops here at the Fluendo office. What suddenly struck me today is how much it looks like a children’s toy, which is appropriate considering who it is targeted at.
But I am sure things like Fisher-Price toys go through a lot of child safety testing to make sure they are, for instance, not poisonous. So the question is, has anyone tried eating parts of their OLPC to make sure we don't risk killing any kids somewhere with it? Or do I need to pick an office volunteer to try eating some OLPC to make sure it's truly safe for the world's children?

LightScribe for Linux

Some time ago I bought a USB DVD burner that supported this feature called LightScribe. Essentially it means you can buy DVDs or CD-ROMs with a special coating and then use the drive's laser to burn text and/or images onto the disc. Looks kinda cool. As I expected back then, the feature was not supported under Linux. But today I noticed that they actually released an SDK for Linux, which means CD burning applications or even graphics applications like the Gimp could potentially support it.

The SDK is available under a standard restrictive proprietary license though, so don't expect the functionality to be included with your average distro anytime soon, unless the licenses of some of the DVD burning applications allow bundling with this non-free library. There is a simple application available from their site, but unfortunately it seemed unable to detect my drive, so I couldn't test whether the LightScribe functionality actually does work.

There are of course two ways to look at this. Either one thinks that their supporting Linux is cool and helps validate the platform for desktop use, even though their support is not free software. Or one considers the support worthless since it is not free software. Personally I do hope that this non-free library doesn't stop people from trying to create open source support for LightScribe burners, but in the meantime I do take their closed source support as a positive sign that the Linux desktop is gaining in importance.

State of vector graphics support

Decided to look into the current state of vector graphics support today. My original test case was whether I would be able to create a graphic in Inkscape, save it, and then load the image into OpenOffice. As I tested I widened my scope with various other interoperability tests. The origin of my testing was the hope that SVG support would be so commonplace and good by now that we had achieved full interoperability between large parts of the desktop. Ended up testing a lot of random file formats and viewers.

I put together a page with my test results and the result was not exactly what I had hoped :)

Be aware that I don’t consider any of the results here as proof of anything except that as a normal user spending 2-3 hours on the problem this was as far as I got.

GNOME plans for the future

Noticed some tiny disturbance in the force before Christmas as Thom Holwerda of OSNews posted an article about what he felt was the sorry state of free desktops. Seems most people in the GNOME camp simply ignored the article as irrelevant, but Aaron Seigo of Trolltech and KDE let it somewhat get to him.

Personally I felt Thom kinda pointed out some troublesome points, but that his context and conclusions were wrong.

First of all he criticized GNOME for not having a clear vision for GNOME 3. Well, this is true, but that is mostly due to not having any clear ideas for something that would require a GNOME 3. GNOME 2 came about as a result of shortcomings in GTK+ at the time, forcing the GTK+ maintainers to break API compatibility in order to improve, for instance, the handling of various writing systems and fonts.

As part of having to port to the new GTK+, some policy changes were made for GNOME in terms of focus and goals. Goals and policies which people are still very happy with and don't see a big need to change.

At the moment GNOME is doing quite well with incremental improvements, with a lot of the major effort by GNOME contributors and companies going into projects such as HAL, X.org, Cairo, NetworkManager, GStreamer, Telepathy, OpenOffice, Firefox and Bluetooth support, to mention a few. The thinking is that having a full featured office suite, for instance, is more important to potential users than having a panel that can be themed into the shape of a sextant. At the same time, the core parts of GNOME are continuously moving forward with incremental improvements or replacements.

Why GNOME's incremental approach is rated lower by Thom than Apple's is not clear to me, but for some reason he feels that unless you put a major version number behind something in the Linux world you by definition stand still.

And to be honest, incremental improvements are what everyone is doing these days. Windows Vista, MacOSX and KDE4 don't really contain anything earth shattering; they are basically incremental improvements over their predecessors. Thom mentions KDE's Plasma, Appeal and Solid in his article as KDE4 efforts. Aaron Seigo mentioned Phonon and Decibel as other examples. Well, if you look at each of them, none of them are actually doing anything 'new'; they are all just attempts at doing what is already being done, but in what each project's maintainers feel is a better way. Which is just the same as how GNOME currently increments forward, although since GTK+ is not breaking API the need/motivation to call it GNOME 3 is not very big. The excitement around Compiz recently showed how more glitz can be brought to the desktop as an incremental improvement.

The thing is that until we find a new way to interact with our desktops, nobody will be doing anything truly significantly new anytime soon, apart from maybe in the application space.

And to give an example of what I am talking about, I want to point to the Nintendo Wii as a device which actually is doing something 'new'. While parts of the technology have been around for quite a while, the way the Wii controller works does truly change the way you interact with the system (making it much more physical, for one) as compared to previous and competing consoles.

In the desktop space I think doing something I feel deserves the title 'new' will be harder, but I think Mirco's Lowfat experiment has the potential. And if it pans out it might become the foundation and focus of a GNOME 3 cycle. But as with all such experimental efforts, we can't commit to it before the proof of concept has reached a bit further, so we know we can accommodate all the major use cases. And maybe in the end it will end up being more like what we at Fluendo try to do with Elisa: an add-on to the current desktop/system for a specific use case rather than a full desktop replacement, yet using many of the same building blocks as the desktop.

So while both GNOME and KDE could do with more developers, I don't see any truly dark clouds on the horizon. And if Thom or anyone else has any clear ideas on something that would require GNOME to change so many of its internals as to justify switching the major version number to 3, then please come forward with them. In the meantime, let's just continue incrementing our way towards perfection within the constraints of the current paradigm :)

The Magic Lamp of Standardization

Just saw an article on what's up next in Linux desktop standardization, where there is a blurb about the discussion at the latest OSDL Desktop Architects Meeting on the issues around Linux audio (or multimedia in general if you want).

The final outcome was this: It was decided to start addressing these issues by creating a focus group and mailing a list of what’s needed from audio APIs, and how to deal with bringing consistency to Linux audio.

Which is about the only kind of outcome a conference like this can ever hope to achieve, and which also shows why it is a complete waste of time trying to address the issue in such a forum.

The DAM meetings are set up as a 'let's get everyone together to see if we can find common ground' type of meeting. Which isn't bad, but it only lends itself to a certain type of problem.

The problem with Linux audio is not that people out there do not 'know' what the problem is; the problem is that people disagree, often quite strongly, on how to solve it, and that there are limited resources available for solving it with any solution. So instead of actually moving towards solving it, we get a lot of cute discussions on related topics, such as when to push and when to pull, as one example.

To solve the problems mentioned in the article you will need to step on some toes, sink a lot of man hours into the chosen solution and in the end win out on excellence. Which surprisingly is a bit harder than one would think :)

I think the OSDL DAM context fails horribly already at the first hurdle, in the sense that the DAM conference is not set up to do any kind of toe stepping, rather to the contrary. Not that I am saying the DAM type meetings are useless, not at all, but a format made for trying to increase cooperation between two quite mature and well established solutions like GNOME and KDE doesn't necessarily lend itself well to a problem requiring some painful trailblazing.

That said, I do think the problems are being addressed, but they are not and will not be addressed by things like the OSDL DAM; instead they are being addressed by individual developers, contributing companies and distributions who are already all moving in mostly the same direction. The thing is, however, that in the contexts these people and groups are moving in, something like the DAM is more of a distraction than a valuable contribution towards the end goal at this point in time.

To let on where I see things moving based on what the distros are doing: on the audio side, the solution that is gaining traction is PulseAudio, which will run on top of ALSA and provide legacy support for ESD and OSS. For the codec problem there are solutions being worked on around GStreamer.

Within the context of these technologies, and of others that follow as a consequence of them, longer term plans are also being made for addressing the issues faced. For instance, making sure that PulseAudio is an acceptable solution both for people doing so-called pro audio and for the normal desktop, like it is on MacOSX.
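To sketch what the PulseAudio-on-top-of-ALSA arrangement looks like in practice: the daemon loads modules for its output backend and for the client protocols it accepts. The module names below are the standard PulseAudio ones as far as I know, but treat this as an illustrative fragment of a default.pa, not a recommended setup.

```
# Sketch of a PulseAudio default.pa: sound goes out through the ALSA
# driver, while both native and legacy ESD clients can connect.
load-module module-alsa-sink device=hw:0    # output via the ALSA driver
load-module module-native-protocol-unix     # native PulseAudio clients
load-module module-esound-protocol-unix     # legacy ESD clients, transparently
```

Legacy OSS applications are handled differently, typically by wrapping them so their /dev/dsp access is redirected to the sound server.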

Ogg Trick modes!

Ok, so Wim went on a hacking spree this Sunday and implemented trick modes with Theora video. This means you can now grab gst-plugins-base CVS and play back Theora in reverse using it. Elisa has some support for this already in Subversion, but the best test application is the 'seek' example application under tests/examples/seek in gst-plugins-base. Been having some fun playing some of the Ogg Theora movies I have in fast-forward mode, slow motion mode, slow motion reverse and fast rewind mode by modifying the 'rate' value in 'seek'. Vorbis sound is also speeding up/slowing down and playing in reverse perfectly. I tried recording a screencast of it using Istanbul, but my machine is not even close to being fast enough to do the trick mode playback and record to Ogg at the same time :). A big thanks to Wim. Really looking forward to demonstrating this at future conferences and meetings.
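Conceptually, what the seek does is ask the pipeline to play the selected segment at a given rate, where a negative rate means the segment is played backwards from its stop position. As a toy illustration (plain Python, not actual GStreamer API), this is roughly how a rate value remaps elapsed playback time to a stream position:

```python
def stream_time_for(running_time, rate, start, stop):
    """Toy model of trick-mode playback: map elapsed playback time to a
    position within the stream segment [start, stop).
    rate > 1.0 fast-forwards, 0 < rate < 1.0 is slow motion, and a
    negative rate plays the segment in reverse starting from `stop`."""
    if rate >= 0:
        return start + running_time * rate
    return stop + running_time * rate  # rate < 0: walk backwards from stop

# Two seconds into playback of a 0-60 second segment:
stream_time_for(2.0, 2.0, 0, 60)    # fast forward: position 4.0
stream_time_for(2.0, -1.0, 0, 60)   # reverse: position 58.0
```

In the real 'seek' example application, the equivalent effect comes from sending the pipeline a seek event carrying the desired rate; the demuxers and decoders then have to produce buffers in the right order for that rate, which is the hard part Wim implemented.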

With the support now implemented in a set of open source decoders and demuxers, hopefully more people in the community will be able to help out with trick mode enabling the rest of our supported formats. Up to this point only the Fluendo Windows Media plugins supported trick modes, but that doesn't really help anyone outside Fluendo understand how trick modes are supposed to work in GStreamer. With these changes the secret sauce is available for anyone to study alongside the already public design document :)

Update on the Texas Instruments OpenMax plugins

Texas Instruments sent in the first version of their OpenMax optimized GStreamer plugins. Their plugins use the open source OpenMax implementation Bellagio/Omxil, which was created by STMicroelectronics. ST plans on releasing a set of GStreamer plugins of their own too, as far as I know.

DVD and GStreamer

For a long time now things have been moving slowly in terms of improving the state of DVD support in GStreamer. Tim Müller has been working on improving it bit by bit over time, but being only one person and having a lot of other tasks on his plate, the progress has been slow. Well, now it seems some help has arrived in the form of Jason Gerard DeRose, who sent in information about his new dvdread element to the mailing list last Friday, at the same time expressing interest in also working on the DVD navigation problem space. Jason's original reason for working on the dvdread element was his KungFu DVD ripper, but with the DVD stuff fresh in his head he is willing to put time into also improving the playback side of things. A big thanks to Jason!

Proprietary 3D drivers

After Mark Shuttleworth's invitation for Suse devs to switch to Ubuntu, which he clarified today, a lot of discussion has happened. A frequent topic that has come up in that discussion is the planned use of binary drivers by default in future Ubuntu releases for ATI and NVidia graphics cards. Well, the binary graphics driver issue is old news I say; I thought about that over 9 months ago ;)

NVidia/Fedora Quake crash

I would like to thank those who provided some feedback on my crash problem. It turned out to be a known bug in the latest released drivers and upgrading to the latest beta solved it. Thanks to the AC who pointed me in this direction.

No Quake3 in my FC6 world

Been running FC6 for a little while now and I am happy with it for the most part. The only irritating regression is that Quake3 doesn't want to run anymore; it actually pulls down the whole X server when I try to run it. This is using the non-free NVidia driver package from livna.org.

The Xorg.0.log file doesn't give me much to work with, at least nothing that Google gives me anything on.

Backtrace:
0: /usr/bin/Xorg(xf86SigHandler+0x81) [0x80d4cc1]
1: [0x414420]
2: /usr/lib/xorg/modules/drivers/nvidia_drv.so(_nv001255X+0x38) [0x2ee5dc]
3: [0xa0fb8c0]

Fatal server error:
Caught signal 11.  Server aborting

Anyone seen and fixed this on their own system? I am using the quake3 version from Fedora Extras.

Linux Desktop and Games

Noticed a discussion on Slashdot on the state of Linux for games, spawned by a (not so good) article on Cedega.

One of the main arguments brought up, which is probably true, is that the PC gaming market is dying/declining due to the increased popularity of consoles. It rhymes well with my own experience, as those of my friends who do game a lot have basically switched from PC gaming to Playstation/Xbox gaming over the last two/three years. If you as a game company are moving your focus from PCs to consoles anyway, I guess looking at adding more 'PC platforms' to your supported list is quite far down the todo list.

That said, there are still some major titles coming out primarily with the PC platform in mind, and I don't accept all the arguments made for why these don't have a native Linux port.

One argument I noticed cropping up was that of the ease of porting between the Xbox and PC platforms, while the Linux/OpenGL/SDL/OpenAL port was harder. I doubt this is the real problem. For example, I did expect more Linux games to come out when the Playstation 2 came out and used GCC and OpenGL, due to ease of porting, but no such ports seemed to happen. Today MacOS X uses OpenGL and OpenAL on a Unix core with gcc, yet few of the titles released for Apple also get a GNU/Linux port. So I think the Linux ports get axed before the difficulty of porting question even arises.

Another question is whether there are enough Linux users out there to warrant a port, or at least enough Linux users interested in playing games to warrant one. That is a hard question to answer. Loki Games did go under, as many have pointed out, but in the aftermath it's hard to say whether it was mismanagement or lack of sales that killed the company. Claims have been made in both directions. I would also hope that we have managed to grow the overall size of the Linux userbase since the days of Loki, which might have changed the dynamics if Loki were doing business today. There are other Linux porting houses like Linux Games Publishing and Runesoft around, and they seem to be surviving, even if they mostly do smaller titles. Transgaming looks like they are doing a healthy business currently, somewhat on the back of the enduring popularity of World Of Warcraft no doubt. So there definitely is a sustainable market for games and games related products on GNU/Linux. Based on some comments I saw from an Epic or Id person a couple of months ago, I guess it is more in the 'we don't lose money on doing Linux ports' category though, as opposed to 'doing Linux ports gives us a nice bundle of extra cash'. We need to get to the second of these two before the major game houses start paying attention, I think.

Linux gaming is still hampered by shitty drivers for 3D, yet I am unsure how direct an impact this has on the lack of game ports. At the level where the decision about whether to support Linux is taken at a company, I don't think there would be awareness of the state of Linux 3D drivers. NVidia's proprietary drivers are probably the only ones out there that provide the quality and performance you want for playing newer titles. Intel's drivers are good, but Intel is currently aiming at the low-end graphics market, which kills them for a lot of current games I think. ATI, as many have pointed out, provide really shitty Linux drivers. I don't fully understand why they get away with it. I mean, according to the grapevine the reason these drivers exist is that the animation companies want them for their renderfarms. Well, if that is true I don't understand how said companies accept drivers with such horrid performance, being about 50% the speed of the same driver for Windows. Losing 50% performance on your renderfarm due to bad drivers would cause a lot of angry customers, I would assume?

Anyway, for someone contemplating a port there might be some awareness that 3D acceleration under Linux has some kind of problems, even if they don't know the details, which wouldn't be helping their value estimation of the Linux market of course. That said, it seems to me people in the community are actively trying to buy NVidia or Intel hardware these days, so hopefully the general image of bad 3D support will lessen over time due to that. It also has to be said in ATI's defence that it does seem like they are trying to improve their drivers currently. The release of AIGLX and Xgl seems to have made them decide to put some more resources into their drivers. Time will tell.

In regards to the general market size, I saw this article today with Red Hat talking about Xen. More importantly for this entry though, it also reports both Novell and Red Hat seeing rapidly growing interest in deploying GNU/Linux desktops. As a digression, I wonder how important the major GNU/Linux and Solaris vendors having standardized on GNOME is for this surge in interest. The Windows games market was built on the back of home office PCs, so maybe that can/will be our path too.