Recently, people in the blogosphere and the more mainstream media have been taking a look at what we’ve been doing on “Super-Mega-Ultra Revolutionary Desktop Environment Paradigm 4” and asking questions such as “Will it live up to the hype?” Speaking frankly, we don’t know where this idea that Super-Mega-Ultra Revolutionary Desktop Environment Paradigm 4 would be a super-mega-ultra revolution in the desktop environment paradigm comes from. Obviously Super-Mega-Ultra Revolutionary Desktop Environment Paradigm 4 won’t be a super-mega-ultra revolution in desktop environment paradigms, and anyone who thinks it possibly could neither understood the various blog posts we have written entitled “Super-Mega-Ultra Revolutionary Desktop Environment Paradigm 4 will be a super-mega-ultra revolution in desktop environment paradigms” nor the conference talks entitled “The Super-Mega-Ultra Revolutionary Desktop Environment Paradigm is almost here”, and missed the note on the website where we clearly denounced the idea that we were creating a new desktop environment paradigm. For those that missed it, I reproduce it in full below:
“In creating The Super-Mega-Ultra Revolutionary Desktop Environment Paradigm 4 we will be creating a super-mega-ultra revolution in the desktop environment paradigm. This will truly be a desktop environment paradigm to surpass all others.”
I don’t think we could have been clearer in communicating our intentions. So if, when we release Super-Mega-Ultra Revolutionary Desktop Environment Paradigm 4 in October, you feel cheated and let down that all we’ve done is change the icons round a bit and copy the widgets idea from Apple, well then you’ve only got yourselves to blame.
Someone called Troy posted a link to Kathy Sierra’s diagram about incremental vs revolutionary improvements and said it explained why KDE4 was so important, why the Ubuntu and GNOME release methodology was doomed to failure, and the meaning of life.
Now, to me, the diagram is worthless (and pretty ugly…) unless the “Big Frickin’ Wall” (1) or “Where you NEED to be” are defined. Troy has done neither, and just says that it shows that KDE4 is the right and only way to get over the wall. To me, KDE4’s methodology of clearing the wall is to take a large running jump and dive headfirst over it, leaving everything that you once had behind on the other side. That is certainly one way to clear it, and it is a very fun and exciting thing to do. The problem is that sometimes, if you don’t know what your target is, or what’s on the other side of the wall, you can miss and hit your head on the concrete.
There is another way to cross the wall and I present another ugly diagram to explain it (if I was any good with gimp/could be bothered I’d have made the steps fade gradually from the blue to the pink, but you get the idea…)
This way is safer and built on firm foundations, and you still get to where you NEED to be; the steps taken are still revolutionary, but revolutionary in small steps. The advantage of doing it this way is that it’s easy for people to follow you, because they know they are still standing on something familiar: in software terms, things they have already tested and found to be a good, strong foundation.
So in response to Troy’s assertion that “This is also why Gnome is just now starting to talk about their long term future”: the reason GNOME are starting to talk about their long-term future goal is not so that we can take this running jump, but rather so that we know what we’re aiming for. And then, once we know this, we can work out the best way to do it.
Yay for mid-90s Austin Powers-inspired risqué humour.
I also have issues with the implication that GNOME has not talked about its long-term future before. I think in reality we set a goal way back in Copenhagen (in 2001, before GNOME 2) and have now seen that goal accomplished. Obviously it is now time to work out where to go next.
GNOME is 10…wooo
But the 3rd birthday of GoneME passed without a mention.
Belated Happy Birthday GoneME!
The vermin allowed a thought to pass them by
We are sorry to announce the sad news…
Satire died once again on Aug 08 2007, at 10:01:41 AM PDT.
The funeral will take place on Thursday at 1pm PDT, from Mr Burridge’s house.
Some time ago I had a chance to talk with a carrot about the differences between mathematics in schools and mathematics in the future. The carrot was showing impressive demos using tiny fractions. One of the subjects we started arguing about was using CPU hardware to perform calculations, as opposed to the 1930s, when calculations always took place on the side of a piece of paper. The idea seemed tempting, though the practical benefits were unclear to me.
I decided to write a small benchmark to see what kind of speed differences we could be talking about. To see if the game is worth playing at all. As the testbed, I chose one of the most fundamental bits of mathematics – addition. I implemented a 100% hw-accelerated version as a program running on the CPU. I compared it against a version written out on a piece of paper in two scenarios – myself doing the sum, and my stuffed monkey. The following setup was used for the test:
- Thinkpad T40p
- The time to add 100 random (pre-generated) numbers was measured
- Sums were randomized with numbers between 0 and 640 and used a random coloured crayon.
- Same set of sums was used in both examples
- Auntie Alice was put off
- The best of 3 test runs was taken
To cut a long joke short: the computer won, I came second, and the monkey still hasn’t finished. *SHOCK RESULT*
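The original benchmark code isn’t shown, so here is a minimal Python sketch of the CPU contender under the setup above (100 pre-generated numbers between 0 and 640, best of 3 runs); the seed, function name, and timing approach are my own assumptions:

```python
import random
import time

# Pre-generate 100 random numbers between 0 and 640, as in the test setup.
# The seed is arbitrary; it just keeps runs repeatable.
random.seed(42)
numbers = [random.randint(0, 640) for _ in range(100)]

def cpu_add(values):
    """The '100% hw-accelerated' contender: plain CPU addition."""
    total = 0
    for v in values:
        total += v
    return total

# Time the addition three times and keep the best run.
timings = []
for _ in range(3):
    start = time.perf_counter()
    result = cpu_add(numbers)
    timings.append(time.perf_counter() - start)

print(f"sum = {result}, best of 3 runs: {min(timings) * 1e6:.2f} microseconds")
```

No crayons, no Auntie Alice, and no stuffed monkey are modelled here; for those scenarios you will have to supply your own hardware.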