Preparing the ground for the Fedora Workstation

Things are moving forward for the Fedora Workstation project. For those of you who don’t know about it, it is part of a broader plan to refocus Fedora around 3 core products, each with a clear and distinctive usecase. The goal here is to have a clear definition of what Fedora is and to have something that, for instance, ISVs can clearly identify and target with their products. At the same time it is an attempt to move away from the traditional distribution model, a model where you primarily take whatever comes your way from upstream, apply a little duct tape to try to keep things together and ship it. That model was good in the early years of Linux's existence, but it does not seem a great fit for what people want from an operating system today.

If we look at successful products such as MacOS X, PlayStation 4, Android and ChromeOS, the common thread between them is that while they were all built on top of existing open source efforts, they didn’t just indiscriminately shovel in any open source code and project they could find. Instead they decided upon the product they wanted to make and then cherry picked the pieces out there that could help them with that, developing themselves whatever they couldn’t find a good fit for. The same is to some degree true for things like Red Hat Enterprise Linux and Ubuntu. Both products, while based almost solely on existing open source components, have cherry picked what they wanted and then developed the pieces they needed on top of them. For Red Hat Enterprise Linux, for instance, its custom kernel has always been part of the value add offered: a Linux kernel with a core set of dependable APIs.

Fedora on the other hand has historically followed a path more akin to Debian, with a ‘the more the merrier’ attitude, trying to welcome anything into the group. A metaphor often used in the Fedora community to describe this state was that Fedora was like a collection of Lego blocks: if you had the time and the interest you could build almost anything with it. The problem was that the things you built also ended up feeling like the creations you make with a random box of Lego blocks, full of pointy edges and weird looking sections, because you had to work around the pieces you happened to have available rather than use the pieces most suited for the job.

With the 3 products we are switching to a model where, although we start with that big box of Lego blocks, we add some engineering capacity on top of it, make some clear and hard decisions on direction, and actually start creating something that looks and feels like it was made to be a whole instead of just assembled from a random set of pieces. So when we are planning the Fedora Workstation we are not just looking at what features we can develop for individual libraries or applications like GTK+, Firefox or LibreOffice, but at what we want the system as a whole to look like. And maybe most importantly, we try our hardest to look at things from a feature/usecase viewpoint first, as opposed to a specific technology viewpoint. So instead of asking ‘what features are there in systemd that we can expose/use in the desktop?’, the question becomes ‘what new features do we want to offer our users in future versions of the product, and what do we need from systemd, the kernel and others to be able to do that?’.

So while technologies such as systemd, Wayland, Docker and btrfs are on our roadmap, they are not there because they are ‘cool technologies’; they are there because they provide us with the infrastructure we need to achieve our feature goals. And what’s more, we make sure to work closely with the core developers to make the technologies what we need them to be. This means, for example, that between myself and other members of the team we are having regular conversations with people such as Kristian Høgsberg and Lennart Poettering, and of course contributing code where possible.

To explain our mindset with the Fedora Workstation effort let me quickly summarize some old history. In 2001 Jim Gettys, one of the original creators of the X Window System, gave a talk at GUADEC in Seville called ‘Draining the Swamp’. I don’t think the talk can be found online anywhere, but he outlined some of the same thoughts in this email reply to Richard Stallman some time later. I think that presentation has shaped the thinking of the people who saw it ever since; I know it has shaped mine. Jim’s core message was that the idea that we can create a great desktop system by trying to work around the shortcomings or weirdness in the rest of the operating system is a total fallacy. If we look at the operating system as a collection of 100% independent parts, all developing at their own pace and with their own agendas, we will never be able to create a truly great user experience on the desktop. Instead we need to work across the stack, fixing the issues we see where they should be fixed, and through that ‘drain the swamp’. If we instead keep trying to solve the problems by adding layers upon layers of workarounds and abstraction layers, we will be growing the swamp, making it even more unmanageable. We are trying to bring that ‘draining the swamp’ mindset with us into creating the Fedora Workstation product.

With that in mind, what are the driving ideas behind the Fedora Workstation? The Fedora Workstation effort is meant to provide a first class desktop for your laptop or workstation computer, combining a polished user interface with access to new technologies. We are putting a special emphasis on developers with our first releases, both looking at how we can improve the desktop experience for developers and at what tools we can offer developers to let them be productive as quickly as possible. And to be clear, when we say developers we are not only thinking about developers who want to develop for the desktop or the desktop itself, but any kind of software developer or DevOps person out there.

The full description of the Fedora Workstation can be found here, but the essence of our plan is to create a desktop system that not only provides some incremental improvements over how things are done today, but which truly tries to take a fresh look at how a Linux desktop operating system should operate. The traditional distribution model, built up around software packages like RPM or Deb, has both its pluses and minuses.
Its biggest challenge is probably that it creates a series of fiefdoms where 3rd party developers can’t easily target the system, or a family of systems, except by spending time very specifically supporting each one. And even once a developer decides to commit to trying to support a given system, it is not clear what system services they can depend on always being available or what human interface design they should aim for. Solving these kinds of issues is part of our agenda for the new workstation.

So to achieve this we have decided on a set of core technologies to build this solution upon. The central piece of the puzzle is the so called LinuxApps proposal from Lennart Poettering. LinuxApps is currently a combination of high level ideas and some concrete building blocks. The building blocks are technologies such as Wayland, kdbus, overlayfs and software containers. The ideas side includes developing a permission system, similar to what you for instance see Android applications employ, to decide what rights a given application has, and developing defined, versioned library bundles that 3rd party applications can depend on regardless of the version of the operating system. On the container side we plan on expanding on the work Red Hat is doing with Docker and Project Atomic.

In terms of some of the other building blocks, I think most of you already know of the big push we are doing to get the new Wayland display server ready. This includes work on core infrastructure like libinput, a new library for handling input devices being developed by Jonas Ådahl and our own Peter Hutterer. There is also a lot of work happening on the GNOME 3 side of things to make GNOME 3 Wayland ready. Jasper St. Pierre wrote up a great blog entry outlining his work to make GDM and the GNOME Shell work better with Wayland. It is an ongoing effort, but there is a big community around it, as most recently seen at the West Coast Hackfest at the Endless Mobile office.

As I mentioned, there is a special emphasis on developers for the initial releases. This includes both small and big changes. For instance we decided to put some time into improving the GNOME Terminal application, as we know it is a crucial piece of technology for a lot of developers and system administrators alike. Some of the terminal improvements can be seen in GNOME 3.12, but we have more features lined up, including the return of translucency. But we are also looking at the tools provided in general, and the great thing here is that we are able to build upon a lot of efforts that Red Hat is developing for the Red Hat product portfolio, like Software Collections, which gives easy access to a wide range of development tools and environments. Together with Developer Assistant this should greatly enhance your developer experience in the Fedora Workstation. The inclusion of Software Collections also means that Fedora becomes an even better tool than before for developing software that you expect to deploy on RHEL: since software collections ensure that the exact same toolchain and toolchain versions are available on both systems, you can be sure that the collection you developed against on Fedora will also be available on RHEL.

Of course creating a great operating system isn’t just about the applications and shell, but also about supporting the kind of hardware people want to use. A good example here is that we put a lot of effort into HiDPI support. HiDPI screens are not very common yet, but a lot of the new high end laptops coming out are using them already. Anyone who has used something like a Google Pixel or a Samsung Ativ Book 9 Plus has quickly come to appreciate the improved sharpness and image quality these displays bring. Due to the effort we put in there, I have been very pleased to see many GNOME 3.12 reviews mentioning this work recently and saying that GNOME 3.12 is currently the best Linux desktop for use with HiDPI systems.

Another part of the puzzle for creating a better operating system is software installation. The traditional distribution model often tended to bundle as many applications as possible, as there was no good way for users to discover new software for their system. This is a brute force approach that assumes that if you checked the ‘scientific researcher’ checkbox you want to install a random collection of 100 applications useful for ‘scientific researchers’. To me this is a symptom of a system that does not provide a good way of finding and installing new applications. Thanks to the ardent efforts of Richard Hughes we have a new Software installer that keeps going from strength to strength. It was originally launched in Fedora 19, but as we move forward towards the first Fedora Workstation release we are enabling new features and adding polish to it. One area where we need the wider Fedora community to work with us is increasing the coverage of appdata files. Appdata files essentially contain the metadata the installer needs to describe and advertise the application in question, including descriptive text and screenshots. Ideally upstreams should ship their own appdata file, but where they do not, we should add one to the Fedora package directly. Currently applications from the GTK+ and GNOME sphere have relatively decent appdata coverage, but we need more effort into getting applications using other toolkits covered too.
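
To give a feel for what this involves, here is a rough sketch of what an appdata file for a hypothetical application called ‘myapp’ might look like; the exact tags and values are illustrative rather than taken from any real package:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative appdata file for a hypothetical "myapp" -->
<application>
  <id type="desktop">myapp.desktop</id>
  <licence>CC0</licence>
  <summary>One-line description shown in the Software installer</summary>
  <description>
    <p>A longer description of what the application does, written for end users rather than packagers.</p>
  </description>
  <url type="homepage">http://example.org/myapp</url>
  <screenshots>
    <screenshot type="default">http://example.org/myapp/screenshot.png</screenshot>
  </screenshots>
</application>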

Which brings me to another item of importance to the workstation. The Linux community has, for natural reasons, been very technical in nature, which has meant that some things that on other operating systems are not even a question have become defining traits on Linux. The choice of GUI development toolkit is one of these; it has been a great tool used by the open source community to shoot ourselves in the foot for many years now. While users of Windows or MacOS X probably never ask themselves what toolkit was used to implement a given application, it seems to be a frequently asked question for Linux applications. We want to move away from that with the Workstation. So while we do ship the GNOME Shell as our interface and use GTK+ for developing tools ourselves, including spending time evolving the toolkit itself, that does not mean we think applications written using for instance Qt, EFL or Java are evil and should be exorcised from the system. In fact, if an application developer wants to write an application for the Linux desktop at all, we greatly appreciate that effort regardless of what tools they decide to use to do so. The choice of development toolkit is a choice meant to empower developers, not create meaningless distinctions for the end user. So one effort we have underway is to work on the necessary theming and other glue code to make sure that if you run a Qt application under the GNOME Shell it feels like it belongs there, which also extends to accessibility related setups like the high contrast theme. We hope to expand upon that effort both in width and in depth going forward.

And maybe on a somewhat related note, we are also trying to address the elephant in the room when it comes to the desktop, which is that the importance of the traditional desktop is decreasing in favor of the web. A lot of things that you used to do locally on your computer you are probably just doing online these days, and a lot of the new things you have started doing on your computer or other internet capable device are actually web services as opposed to local applications. The old Sun slogan of ‘The Network is the Computer’ is more true today than it has ever been. We don’t believe the desktop is dead in any way or form, as some of the hipsters in the media like to claim; in fact we expect it to stay around for a long time. What we do envision, though, is that the amount of time you spend on webapps will continue to grow and that more and more of your computing tasks will be done using web services as opposed to local applications. Which is why we are continuing to deeply integrate the web into your desktop, be that through things like GNOME Online Accounts or the new webapps introduced in the Software installer. And as I have mentioned before on this blog, we are also still working on improving the integration of Chrome and Firefox apps into the desktop along the same lines. So while we want the desktop to help you use the applications you run locally as efficiently as possible, we also realize that you, like us, are living in a connected world, and thus we need to give you easy access to your online life to stay relevant.

There are of course a lot of other parts to the Fedora Workstation effort, but this has already turned into a very long blog post, so I will leave the rest for later. Please feel free to post any questions or comments and I will try to respond.

Excited about Cockpit

So we had the DevConf conference here in Brno this weekend. One of the projects I am really excited about is Cockpit. Cockpit is a new server administration tool developed by Red Hat engineers which aims at providing a modern looking and user-friendly interface for your servers. There have been many such efforts over the years, but what I feel makes this one special is that it got graphical designers and interface designers involved, to ensure that the user experience is kept in focus instead of being taken hostage by underlying APIs or systems. Too many such interfaces, be they web based or not, tend to both feel and look clunky, for instance sometimes exposing features not because anyone realistically would ever want them, but because the underlying library happens to have a call for it.

Cockpit should also hopefully put the final nail in the coffin for the so called ‘server desktop’. The idea that you need to be able to run a graphical shell using X on your server adds a lot of pain with little gain in my opinion. The Fedora Server product should hopefully become a great showpiece for how nice a Linux server can be to use and configure when you have something like Cockpit available.

There were some nice videos shown at the conference demonstrating what is already in Cockpit, so hopefully they will be available online soon. In the meantime I recommend taking a look at the Cockpit web page.

A thank you to Google from Desktop Linux

We sometimes grumble a bit about Google in the community when they do things we feel are not generally helpful to the overall community. But I think we should be equally good at saying thanks when Google does great things. Thanks to our LibreOffice superstar Caolán McNamara I was made aware that Google has released two new open fonts along with Chrome. So what is so exciting about a new font, you say?

Well one of them, called Carlito, is metrically compatible with the current MS default font called Calibri. You can get the font here. It is licensed under the OFL 1.1.

So for those wondering what metrically compatible means (I for sure did when I first heard the term), it basically means that while the individual glyphs in the font don’t look like the Calibri ones (that would not be legal), each individual letter has the same height and width as its Calibri counterpart. This matters because if you import a document using Calibri into LibreOffice and you don’t have Calibri or a metrically compatible font installed, your document layout will change, as the substitute font LibreOffice picks may for instance have letters that are in general slightly wider. With Carlito installed this is no longer a problem: the glyphs might look a bit different, but you can be sure that the overall layout stays the same.
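
The way this typically gets wired up on the system is with a fontconfig substitution rule, so that requests for Calibri fall back to Carlito automatically. A minimal sketch of such a rule could look like the following, though the rules actually shipped with the packaged font may well differ:

<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- Illustrative rule: when a document asks for Calibri, accept the metrically compatible Carlito -->
  <alias binding="same">
    <family>Calibri</family>
    <accept><family>Carlito</family></accept>
  </alias>
</fontconfig>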

And for certain professions that can be crucial, for instance try speaking to the legal team of your company about them using LibreOffice and they are likely to tell you that they will only do that if they can feel certain that when another lawyer sends them a contract, the layout will not change when they view it, as such changes could at least potentially be the cause of a dispute over the meaning of a paragraph. (That worry was probably the main reason the legal profession stayed with Word Perfect for such a long time, when the rest of the world had moved on.)

So we are now going to get this new font packaged for Fedora and Red Hat Enterprise Linux as soon as possible, to make your productivity experience even better :)

So thank you Google, this is much appreciated!

Brno GUADEC Call for Papers!

I would like to give everyone a friendly reminder that Saturday the 27th of April is the official deadline for the GUADEC 2013 Call for Papers. So make sure to get your proposal submitted.

We hope to have a wide range of talks this year, including talks on related subjects such as Wayland and multimedia, so don’t automatically assume that you will not get a talk approved because it’s not ‘pure GNOME’.

GUADEC this year will be in Brno in the Czech Republic, so I hope to see as many of you as possible here.

Open Source software and crowdsourcing

Thanks to crowdsourcing we are getting a lot of Linux games: titles such as Double Fine Adventure, Wasteland 2, Spaceventure, Project Eternity, Hero-U: Rogue to Redemption, Godus, Torment: Tides of Numenera, Arcade Racer and a lot more are all coming to the Linux platform.

The question one would want to answer is whether crowdsourcing can work for open source projects creating useful software or other kinds of tools. One project I am really hoping gets funded is Geary, the open source email client from the great guys at Yorba. Not that I personally am desperate for a new email client (I am a happy user of the Evolution email client), but I also know that email is a very personal thing for many people and an area where having different clients offering different experiences might make a difference. Also, proving that a non-profit outfit like Yorba can fund themselves through crowdsourcing is an important thing to demonstrate. So I have already pledged to Geary and I hope you will too.

They are not the only such project out there of course. Another project I have pledged to is the Phantom Open Emoji project, which wants to create a full set of liberally licensed emoji/smileys covering the more than 800 such symbols in Unicode 6.0. They only need to reach 25,000 USD in funding to create the full set, so I hope they can make that goal and we can then include support for these in Empathy.

The final project I want to mention is the OpenShot Kickstarter. While I do think the best solution would be a GStreamer based one like PiTiVi, for various reasons I am still happy that this Kickstarter has already reached its minimum funding goal, as it does show success at funding open source development projects.

That said, what strikes me is that the 3 open source projects above are all actually quite cheap, in the sense that the funding amounts they ask for are not high at all. And when you consider that games such as Torment reach 4 million USD in crowdsourcing funding, one could wish that people would be a bit more prepared to bankroll open source projects. I guess that just like games had their breakthrough moment with the Double Fine Adventure Kickstarter, maybe open source development needs its own shining star to lead the way. So if you haven’t already, please pledge to one or more of the open source efforts above.

There will be more attempts at exploring this space, I am sure; I am even planning to be involved in one such effort myself, but more details on that later.

GStreamer Hackfest in Milan

As those of you following the GStreamer development mailing list or the GStreamer Google Plus profile know, we have been having a GStreamer hackfest in Milan over the last few days. We have 17 people here at a nice place called the Milan Hub, all hammering away at our laptops or discussing various technical challenges.

A lot of progress has been made during these days, with some highlights including work on fixing the use of Gnonlin with GStreamer 1.0, which is a prerequisite for getting PiTiVi and Jokosher running with GStreamer 1.0. Jeff Fortin, Thibault Saunier, Nicolas Dufresne, Edward Hervey, Peteris Krishanis and Emanuele Aina have all been helping out with this, in addition to fixing various other issues in PiTiVi and Jokosher.

Sebastian Dröge has put a lot of work during the hackfest into providing the basic building blocks for doing hardware codecs nicely in GStreamer, and Víctor Jáquez has been working on making VAAPI work well using these building blocks, with the plan among other things to make sure you have hardware accelerated decoding working with WebKit. In that regard, Philippe Normand has spent the hackfest investigating and improving various bits of the GStreamer backend in WebKit, like improving the on-disk buffering method used. Also in terms of hardware codec support, Edward Hervey found a bit of time to work a little on the VDPAU plugins.

Speaking of web browsers, Alessandro Decina has been working on porting Firefox to GStreamer 1.0. He has also been our local host, making sure we found places to eat lunch and dinner that were able to host our big group. So a big thank you to Alessandro for this.

Wim Taymans has been working on properly dealing with chroma keying in GStreamer, improving picture quality significantly in some cases, in addition to being constantly barraged with questions and discussions about various enhancements, bugs and other challenges.

Edward Hervey has, in addition to helping out with GNonlin, also been working on improvements to our DVB support and on improving encodebin so that you can now request a named profile when requesting pads, the last item being a crucial piece in terms of allowing me to proceed with Transmageddon’s multistream support.

Stefan Sauer spent time on fixing various bugs in the GStreamer 1.0 port of Buzztard and took a first stab at designing a tracing framework for GStreamer.

Arun Raghavan was working on various bugs related to PulseAudio and GStreamer and also implemented an SBC RTP depayloader element for GStreamer.

Tim-Philipp Müller has been working on implementing a stream selection flag so that GStreamer based players can follow any in-file hints about which streams to default to, or not to default to for that matter.

As for myself, I have been mostly working on Transmageddon, trying to get the multistream and DVD support working. Thanks to some crucial bugfixes from Edward Hervey and Wim Taymans I was able to make good progress, and I have now ripped my first DVD with Transmageddon. There is still a lot of work that needs doing, both in terms of presentation, features and general robustness, but I am very pleased by the progress made.

Title selection screen, needs a bit more polish, but getting there.

Transmageddon screenshot ripping a DVD
As you can see above, you can now choose to transcode each sound stream to a different codec, or drop the streams you don’t care about. The main usecase for different codecs is to use one codec for surround sound streams and another for stereo or mono streams.

A big thank you to Collabora and Fluendo for sponsoring us with dinner during the hackfest.

Also a big thank you to Collabora, Fluendo, Google, Igalia, Red Hat and Spotify for letting their employees attend the hackfest.

Steam on Fedora – Let's get gaming!

As you probably already know, Steam is now available for Linux. While it is currently officially only available for Ubuntu, you can run it on Fedora too. Tom ‘Spot’ Callaway has made this yum repository available. So just put the .repo file you find there into your ‘/etc/yum.repos.d/’ directory and you should be able to do ‘yum install steam’. I have been running this repository through the beta period and the games I have tried so far work great. Crusader Kings 2 is my clear favourite so far.

A screenshot of Steam running on my Fedora 18 desktop:

More hiring! Join the Red Hat desktop team and make a difference!

We are looking for some more people to join the Red Hat Desktop team. We have some flexibility on the tasks we need these new hires to do, so we are casting the net wider this time. We are open to candidates from anywhere in the world where Red Hat has an office. For the right candidate working from home is an option, but you would still need to live in a country where we have an office. That said candidates interested in joining the 500 people strong and growing team at the Brno office in the Czech Republic will be preferred, especially in cases where we have multiple candidates with similar skill levels.

We are looking for people who would be available to join Red Hat sometime this year, so if you are a student graduating this summer you can still get in touch. We don’t have a hard list of requirements, but of course experience with any of the items below, or something similar, will increase the likelihood of us being interested, and candidates with existing open source contributions will always be preferred over candidates who have never contributed to an open source project before.

  • GTK+/GNOME
  • C/C++/Python/Vala
  • OpenGL/Clutter
  • JavaScript/HTML5/WebRTC
  • X Windows
  • Touch screen technologies
  • GStreamer
  • D-Bus
  • LDAP/Active Directory

So if you want to join the world's leading Linux company and help make the desktop rock, please send an email to Tyler Siprova, who handles the hiring process for us in the Desktop team. She can be reached at tsiprova(at)redhat(dot)com. Be sure to refer to this blog entry in your email so she knows the context of your application. Also be aware that I will be at the FOSDEM conference in Belgium in February, so I would be happy to sit down with anyone interested to talk about the opportunities we have here at Red Hat; if that is of interest, be sure to ask for such a meeting to be set up in your email to Tyler.

A Linux Game Changer?

So the Linux based Steam gaming console has been released, or at least one version of it. It is called Piston and it seems quite nice looking.

Personally I think this device has the potential to truly transform the Linux desktop and gaming market. If this thing takes off it could for instance make Linux drivers the top priority for the makers of graphics chips. And people specializing in gaming oriented high end PCs would also be likely to start offering those machines with Linux.

So I don’t know about you, but I will for sure buy one of these boxes when it comes out :)

Improved handling of files with multiple tracks in GStreamer

Thanks to Sebastian Dröge there is a new thing in GStreamer called a streamid. It basically gives every stream inside a given file a unique id, making files with multiple streams a lot easier to deal with. The streamid is also supported by the GStreamer Discoverer object, so once you have identified the contents of a file with Discoverer you can be sure to grab the exact stream you want coming out of (uri)decodebin by checking the pad for the streamid. The most common usecase for this is of course files with multiple audio streams in different languages.

From the output of Discoverer the stream id is really easy to get: on the stream object you get out of Discoverer you just run:

stream.get_stream_id()
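
For a slightly more complete picture, here is a minimal sketch of how you could use Discoverer to list the stream ids of the audio streams in a file; the file path is of course just an example:

import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstPbutils', '1.0')
from gi.repository import Gst, GstPbutils

Gst.init(None)

# Discoverer with a 5 second timeout
discoverer = GstPbutils.Discoverer.new(5 * Gst.SECOND)
info = discoverer.discover_uri('file:///home/user/example.mkv')  # example file

# Print the streamid (and language, if any) of each audio stream
for stream in info.get_audio_streams():
    print(stream.get_stream_id(), stream.get_language())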

On the pad you get from decodebin or uridecodebin the path is a bit more convoluted, but not too hard once you know how (there might be some kind of convenience API added for this at some point).

Before you connect the pad you get from the bin, you attach a pad probe to it like this:

src_pad.add_probe(Gst.PadProbeType.EVENT_DOWNSTREAM, self.padprobe, None)


Then, in the probe function you defined, you can extract the stream_id with the parse_stream_start call as seen below:

def padprobe(self, pad, probeinfo, userdata):
    # Pull the event out of the probe info
    event = probeinfo.get_event()
    # STREAM_START is the event that carries the stream id
    if event.type == Gst.EventType.STREAM_START:
        streamid = event.parse_stream_start()
    return Gst.PadProbeReturn.OK
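
To tie the two halves together, here is a rough sketch of how this could be used from a pad-added handler to link only the stream you picked earlier with Discoverer; self.wanted_stream_id and self.link_pad_to_sink() are hypothetical names standing in for whatever your application actually uses:

def on_pad_added(self, element, src_pad):
    # Install the probe on every new pad coming out of (uri)decodebin
    src_pad.add_probe(Gst.PadProbeType.EVENT_DOWNSTREAM, self.padprobe, None)

def padprobe(self, pad, probeinfo, userdata):
    event = probeinfo.get_event()
    if event.type == Gst.EventType.STREAM_START:
        # Only hook up the pad whose id matches the stream chosen with Discoverer
        if event.parse_stream_start() == self.wanted_stream_id:
            self.link_pad_to_sink(pad)  # hypothetical helper that does the linking
    return Gst.PadProbeReturn.OK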

I have been using this code in my local copy of Transmageddon to start implementing support for files with multiple audio streams (supporting multiple video streams would also be easy, but I am not sure how useful it would be). There is a screenshot of my current development snapshot below, but I am still trying to figure out what would be a nice way to present it; the current setup will look quite crap if the incoming file has more than a few audio streams. Suggestions welcome :)


Transmageddon multistream development snapshot