July 16, 2009
community, guadec, maemo
7 Comments
Note: I actually wrote something like this already in GNOME Blog, and a combination of the Intel graphics freezes in Jaunty and GNOME Blog not creating a local copy of in-progress entries cost me the lot. Funny that WordPress, a web-app, offers better transparent data retention across unexpected events than a local client. I have resolved to use Tomboy for drafting blog entries off-line now, and to figure out how to patch GNOME Blog to save drafts.
My Gran Canaria adventure started in a funny ha ha way when I got to the airport and was told I wasn’t on the plane I had a ticket for. I checked my email to ensure I hadn’t received any schedule change emails, and found the last mail I received from Expedia, indicating I was booked on the 15h flight from Lyon to Madrid. But the friendly & helpful people at the Air France desk eventually figured it out: the airline had bumped me to an 8am flight, with a transfer to Gran Canaria arriving in the early afternoon. The travel agent wasn’t aware of it (I checked later when I got some internet access). So the Air France people asked if I minded flying through Bilbao. I said no (imagining they meant that I’d be flying from Lyon to Bilbao), and they checked in my bags and gave me a boarding pass. For the plane to Madrid.
“I don’t understand”, I said. “We can’t issue you a boarding pass for the Madrid-Bilbao or Bilbao-Las Palmas legs now”, they explained. Ah. When I looked at the transfer times and realised that (if we were on time) I would have 30 minutes to transfer in Bilbao, I was told that I would probably be able to get a boarding pass for the Bilbao-Las Palmas leg in Madrid.
When I got to Madrid, I queued behind some Swedes who were on their way to some holiday destination and had just been told that their flights were over-booked, and that they’d be staying in Madrid for the night. Happy happy joy joy. I also, surprisingly, ran into Alex Larsson, who was looking for a boarding card for his flight to Gran Canaria, which was delayed. For a second I debated asking to get on the flight with him, but figured that my bags wouldn’t make it even if I did, so I decided to play it safe.
The transfer desk in Madrid couldn’t issue me a boarding card for Las Palmas, so with 35 minutes transfer time, I would have to find the transfer desk, get a boarding pass, and hope that both my bags and I made it to the plane on time. I was not optimistic. After checking in, I bought a nice bottle for the SMASHED meeting, a Yamazaki 10yo.
Landed in Bilbao (the approach looks beautiful, I really want to visit the Basque country now), and found that there was no transfer desk. I had to go past security, with my newly purchased bottle of Yamazaki, check in, go back through security, and get both my bags and myself onto the plane. I have learned over time that the quality of hustle is important in airports. Relax when things are beyond your control, and when you can do something about it, run. So I ran. Headless chicken style.
An airport attendant who took pity on my cause very kindly brought me out through the security check-point, and I left my whiskey with the security guard. I ran to the first check-in counter I found to ask where I could check in for my flight. And by complete coincidence, the girl who was supposed to be manning my check-in desk had stepped away to chat with her friend, since the flight was almost closed, and her friend was minding the very check-in desk I ran to.
Checked in, registered baggage tags to get the bags on the plane, back through security, got my whiskey back, ran to the plane, and (with take-off delayed a few minutes) felt much more confident about making it to the islands that night, with baggage in tow. Be thankful for the kindness of strangers. And it’s better to be born lucky than rich. All in all, a day made much better by the desire of everyone I met to be nice & helpful, in spite of the bureaucracy they work under.
Landed, picked up my bags, got a taxi to the hotel, and dropped them off. Said hello to someone with a laptop in the lobby (Hi mpt!), and ran to the welcome party to see if anyone was left, as it was now almost midnight local time.
I forgot this was Spain.
I met lots of people on the way. Lots of people (but no free beer) were still at the party. Talked briefly with Stormy, Lefty and family, Quim, Oskari, Henri, Sebas, Richard Dale, Rob Taylor and many more over a couple of nice beers. Thanks Canonical for the t-shirt and for the party, a great time was had. Home & to bed by 3. So endeth day 0.
July 15, 2009
community, gnome, guadec
13 Comments
First in a long series that will probably get finished next June, just in time for the next edition
Of course I was aware of the reaction to RMS’s keynote during the conference, and spoke about it with Lefty on a number of occasions.
I have been bothered by the creation of a “meme” which has, apparently, been perpetuated by people who weren’t even at the conference. The meme seems to speak more to Richard’s Mono comments (my opinion here) than to the Emacs virgins segment, but it’s sufficiently ambiguous that I can’t tell.
If people are primarily concerned about the Mono comments, then say so – it’s not useful to conflate two issues. If you’re primarily concerned with the Emacs virgins jokes, then for all those who weren’t at the keynote, or who don’t remember exactly what Richard said, go look at it now:
Aside: anyone know how to embed a youtube video on GNOME Blogs?
Richard is sufficiently predictable that he has been giving the same segment, word for word, for many years – last week was my third time hearing it – and to my knowledge this is the first time there has been such an outcry.
Personally, I didn’t think it was offensive. I’m a born & bred (unbelieving) catholic, and we’re big into the Virgin Mary ourselves, and while the “relieving them of their emacs virginity” line felt a bit awkward, I didn’t think that the segment was particularly offensive or inappropriate. I could see how others might feel uncomfortable, and so I have no problem with someone who did feel that way taking the point up with Richard directly. Go look at the video, and make up your own mind.
This is to underline a point: offensiveness is in the eye of the beholder. It is dangerous to jump on a bandwagon about something as significant as whether someone was inappropriate or not if you were not there. I spoke to a number of people who were bothered by the speech, and many more who hadn’t noticed anything in particular, and who laughed along. It’s very easy to jump on a morally outraged bandwagon without knowing exactly what you’re talking about.
I don’t mind people being morally outraged, I occasionally am myself, but at least make sure you actually are before you get in a huff. I have a lot more respect for Lefty, Chani and others who were at the conference than for the sheep jumping on the issue as an easy way to take a pot-shot at the FSF and Richard Stallman. Oh – and for all the Boycott Novell crowd who are jumping on this as a way to get at people who support Mono, what I said earlier goes for you too: conflating the issues isn’t helpful, in fact it’s inflammatory. Stop harming our community with your bad behaviour.
By the way, the “Stop sexism” sign referred to a presentation at a Rails conference, where a guy was using scantily clad glamour model shots to illustrate his talk about how “hot” Rails was, IIRC. A bunch of Rails heavyweights including DHH jumped in to defend him against the “thin-skinned” crowd. Is a parody of the Christian church comparing an editor to a god really on the same scale? I dunno, maybe. Like I said, I can see how some people might not like it, but it didn’t bother me.
Can we move on now?
Update: Before moving on, one thing needs clarification. Let me emphasise one thing I said above: while I personally didn’t find RMS’s Emacs virgins segment offensive, I can see how others might. Taking someone to task because they were made uncomfortable by something is never acceptable. Accept that they were made uncomfortable, explain that it wasn’t intentional, apologise, move on. As I said, being offended is in the eye of the beholder. Other people are just as entitled to feel uncomfortable as you are to be unoffended. So to all those posting comments in Chani, Lefty and others’ blogs telling them to grow a thicker skin, get a life, or whatever other bile you’ve been spewing, think about that. And then don’t post the comment.
July 7, 2009
General, gnome, guadec
1 Comment
I was talking with Aaron Bockover yesterday and he told me that he wasn’t going to give the Silverlight talk which he had submitted back in March, and that he planned to give a presentation on something completely unrelated that he found interesting. Chris Blizzard suggested that he could give a lightning talk on amateur aeronautics, and as the idea spread, a whole bunch of suggestions came up for interesting non-GNOME subjects that GNOME community members care about, from architecture to running. There’s also a really valuable short talk on the burnout cycle (and how to break it) from Jono Bacon in there. So for 45 minutes, we will have a set of lightning talks reflecting the eclectic nature of the GNOME community – if you see Aaron and have something you are passionate about that you want to talk about for 3 to 5 minutes, grab him today or turn up at his session at 5:30 and shout.
And spread the word!
July 2, 2009
freesoftware, gimp, gnome, maemo, openwengo
43 Comments
The GNOME press contact alias got a mail last weekend from Sam Varghese asking about the possibility of new Mono applications being added to GNOME 3.0, and I answered it. I didn’t think much about it at the time, but I see now that the reason Sam was asking was because of Richard Stallman’s recent warnings about Mono – Sam’s article has since appeared with the ominous-looking title “GNOME 3.0 may have more Mono apps“. And indeed it may. It may also have more alien technology, we’re not sure yet. We’re still working on an agreement with the DoD to get access to the alien craft in Fort Knox.
Anyway – that aside, Richard’s position is that it’s dangerous to include Mono to the point where removing it is difficult, should that become necessary to legally distribute your software. On the surface, I agree. But he goes a little further, saying that since it is dangerous to depend on Mono, we should actively discourage its use. And on this point, we disagree.
I’m not arguing that we should encourage its use either, but I fundamentally disagree with discouraging someone from pursuing a technology choice because of the threat of patents. In this particular case, the law is an ass. The patent system in the United States is out of control and dysfunctional, and it is bringing the rest of the world down with it. The time has come to take a stand and say “We don’t care about patents. We’re just not going to think about them. Sue us if you want.”
The healthy thing to do now would be to provoke a test case of the US patent system. Take advantage of one of the many cease & desist letters that get sent out for vacuous patented technology to make a case against the US PTO’s policy pertaining to software and business process patents. Run an “implement your favourite stupid patent as free software” competition.
In all of the projects that I have been involved in over the years, patent fears have had a negative effect on developer productivity and morale. In the GIMP, we struggled with patent issues related to compression algorithms for GIF and TIFF, colour management, and for some plug-ins. In GNOME, it’s been mostly Mono, but also MP3, and related (and unrelated) issues have handicapped basic functionality like playing DVDs for years. In OpenWengo, the area of audio and video codecs is mined with patent restrictions, including the popular codecs G729 and H264 among others.
What could we have achieved if standards bodies had a patent pledge as part of their standardisation process, and released reference implementations under an artistic licence? How much further along would we be if cryptography, filesystems, codecs and data compression weren’t so heavily handicapped by patents? Or if we’d just ignored the patents and created clean-room implementations of these patented technologies?
That’s what I believe we need to do. Ignore the patent system completely. I believe strongly in respecting licensing requirements related to third party products and developer packs. I think it’s reasonable to respect people’s trademarks and trade secrets. But having respect for patents, and the patent system, is ridiculous. Let a thousand flowers bloom, and let the chips fall where they may.
So if you want to write a killer app in Mono, then don’t let anyone tell you otherwise. If you build it, they will come.
June 19, 2009
community, maemo
Comments Off on Maemo Summit – help make it great
This year, I’ve been asked to help with the content selection for the Maemo Summit, which will be held in October, in Amsterdam. We’re aiming for a very cool conference with lots of tips, tricks, hacks and general hardware coolness over 3 days.
Nokia is organising the first day, and the second and third days are entirely organised by the community. After a round of discussion, Valerio Valerio, Jamie Bennett and I will be choosing content for the summit from among presentations proposed by the community. We’re aiming for presentations which will target three main audiences: tablet users, application developers and platform developers.
You can read more about the call for content or how to submit a presentation on the Maemo wiki. We’ve agreed on a fairly novel way of filling the schedule – we are starting from an empty grid, with three tracks, a couple of plenary sessions, and some lightning talks. As great talks come in, we will add them directly to the grid. If we don’t think a talk is up to scratch, it will be rejected and the submission moved to the Talk page of the Submissions wiki page; if we are hesitant, the proposal will stay in the Submissions queue.
This has some great benefits over the usual call for papers/deadline/selection/publish-the-entire-schedule way of doing things. Most proposers will know straight away whether their talk has been accepted, rejected, or converted into a lightning talk. Attendees will see the schedule building up and will be able to propose sessions covering topics that are not yet addressed. And we will be able to keep a small number of slots until quite late in the organisation cycle for “late breaking news” – those great presentations that arrive too late for your deadline, but which you would really love to see get onto the schedule. And it is a kind of auction system – you have a strong incentive to get your presentation proposal in early, rather than waiting for the last minute.
Anyway – let’s see how it works. You can follow the progress of the schedule on the wiki as well.
Good luck to all!
June 9, 2009
General
10 Comments
Warning: politics post
Since moving to France, the only elections I get to vote in here are the European and municipal elections – so on Sunday I blew the dust off my voter card & trotted down to my local “bureau de vote” as one of the 40% of the French electorate who voted. I had a chance to think about why the European elections inspire people so little.
In the past couple of weeks, debate about European issues has been mostly absent from newspapers and TV. What little we hear is more like celeb news – “he said, she said” or “the sworn enemies unite and appear on stage together pretending they like each other”. But to me, the fundamental questions about what we expect from Europe, and how a vote for one party or another will move towards that vision, are absent.
There are a few reasons for this – the political groupings in the EU parliament are detached from the local political landscape in France. Even the major groupings like the EPP, PES, the Liberals and the Greens don’t have an identity in the election campaign. There is no European platform of note. Very little appears to be spent on advertising. In brief, the European election appears to the public to be nothing more than a mid-term popularity contest with little impact on people.
That is not to say that the EU has no impact. But the European parliament is quite hamstrung by the European law-making process, as we saw with the vote for the EUCD: in that case, the EU parliament was unhappy with the law proposed by the commission, and proposed many amendments which improved the law, only to see the majority of these reversed by the council of ministers. When the law came back to the parliament, there were three options available: accept the law, reject it outright (requiring an absolute majority of MEPs, difficult to obtain), or reject it by a majority (by proposing amendments) and send it into a conciliation committee, made up 50% of nominees from the council of ministers and 50% from the EU parliament.
The process is weighted toward the commission (which writes the law in the first place) and the council of ministers, who have veto power at every stage, and against the parliament, due to the requirement of an absolute majority for rejection in second reading. The commission and the council of ministers are both nominated by the governments of the member countries. I would argue that because of this, they don’t represent the European population, so much as they represent a cross-section of European political parties.
On other occasions, a stand-off between the governments and the EP is possible – as with the nomination of the Barroso commission in 2004. And when people are asked their opinion on the direction of Europe, as in the first referendum on the Nice treaty in Ireland, the French and Dutch referenda on the European constitution, and now the referendum on the Lisbon treaty in Ireland, if the result doesn’t match with what is supported by the member governments, a way is found to work around the result. In the case of a small country like Ireland, a couple of special case amendments, and you rerun the referendum. For the bigger countries like France, you renegotiate the form of the agreement so that it’s a treaty, not a single document (which, by the way, makes it harder to read and understand), so that you can ratify it with a working majority in parliament.
And so Europeans are slowly but surely distancing themselves from Europe. Fringe parties and independents representing a protest vote get very good scores, like the UKIP in the UK, or NPA and (until recently) the Front National in France. The European parliament is becoming less representative of European opinion, rather than more representative. Only 4 in 10 registered voters go to the polls. I would be willing to bet that Lisbon will not pass the second time around in Ireland, plunging Europe into another institutional crisis.
These are the twin problems facing Europe: the national governments in Europe are not representing the views of their citizens, and the only representative body we have is pretty ineffectual, even when they try to do something.
The solutions, in my opinion: elect commissioners and members of the council of ministers. Create Europe-wide political parties with Europe-wide campaigns, as in the US. Let the voters know what they’re voting for in the parliament, and allow them to vote for the executive branch of the European government. The path to greater voter activity in Europe is greater voter inclusion in the electoral process.
May 26, 2009
General
9 Comments
I recently upgraded from Ubuntu 8.10 to 9.04 and in the process “cleaned up” the distro using the very useful option to “make my system as close as possible to a new install” (I don’t remember if that’s the exact text, but that was the gist of it). Last night, I tried to use the printer in my office for the first time since upgrading, an Epson Stylus Office BX300F (an all-in-one scanner/printer/copier/fax).
With 8.10, I finally got printing working – I don’t remember the details, but I do recall that I had to install pipslite and generate a new PPD file to get a working driver for the printer, which I found through the very useful OpenPrinting.org website. It’s a fairly new printer, on the market since September 2008 as far as I can tell, cheap, and part of a long-running series from Epson (the Linux driver available for download on the Epson site is dated early 2007).
Nonetheless I was reassured by OpenPrinting’s claim that the printer and scanner “work perfectly”, and I wasn’t expecting to have to download a source package, install some development packages, and compile myself a new Ubuntu package to get it working. And then discover that there was a package available already that I just hadn’t found. But anyway, that was then…
When I upgrade my OS, I have a fairly simple expectation, that changes I have to make to the previous version to “fix” things don’t get broken post-upgrade. There are some scenarios where I can almost accept an exception – a few releases ago, I had problems with Xrandr because changes I had previously had to make to get my Intel hardware working properly were no longer necessary as X.org integrated and improved the driver – but it took me a while to figure out what was happening, and revert my Xorg config to the distro version.
Yesterday, when I had to print some documents, I got a nice error message in the properties of the printer that let me know I had a problem: “Printer requires the ‘pipslite-wrapper’ program but it is not currently installed. Please install it before using this printer.” And thus began the yak-shaving session that people could follow on twitter yesterday.
- Searched in Synaptic for pipslite – found – but: “Package pipslite has no available version, but exists in the database.” Gah!
- Tried to find an alternative driver for the Epson installed on the system: no luck. Hit the forums.
- Noticed that libsane-backends-extra wasn’t installed; installed it to get the epkowa SANE back-end, and “scanimage -L” as root worked (for the first time) – so went on a side-track to get the scanner working as a normal user.
- Figured out which USB node the scanner was, chgrp scanner, scanning works!
- Then figured out how the group gets set on the node when plugging in: found the appropriate udev rules file (/lib/udev/rules.d/40-libsane-extras), copied it to /etc/udev/rules.d, added a new line to get the scanner recognised (don’t forget to restart udev!), and scanning works!
- Re-downloaded the driver from the website linked to in OpenPrinting’s page for the printer – they have a .deb for Ubuntu 9.04! Rock!
- Installed the driver; the error message changed, but still no printing: “/usr/lib/cups/filter/pipslite-wrapper failed”. Forums again.
- Tried to regenerate a PPD file: pipslite-install: libltdl.so.3 not found. ls -l /usr/lib/*ltdl*: libltdl.so.7 – Bingo! The pre-built “Ubuntu” binaries don’t link against the right versions of some dependencies.
- Downloaded the source code, compiled a new .deb (dpkg-buildpackage works perfectly), installed it, regenerated the .ppd file (don’t forget to restart CUPS), and we have a working printer!
4 hours lost.
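For anyone who lands here with the same printer, the udev and rebuild steps boil down to something like the sketch below. Treat it as a rough outline rather than a recipe: the USB product ID and the tarball name are placeholders (check lsusb and whatever you actually download), and package names may differ on your release.

    # Make the scanner node accessible to the "scanner" group.
    # Find the device's vendor/product IDs first:
    lsusb | grep -i epson

    # Copy the shipped rules file so your change survives upgrades,
    # then add a line along these lines (the product ID is a placeholder):
    sudo cp /lib/udev/rules.d/40-libsane-extras* /etc/udev/rules.d/
    # ATTRS{idVendor}=="04b8", ATTRS{idProduct}=="xxxx", MODE="0664", GROUP="scanner"
    sudo /etc/init.d/udev restart

    # Rebuild pipslite from source so it links against the libltdl actually
    # on the system (libltdl.so.7 rather than libltdl.so.3). Install whatever
    # -dev packages the build complains about along the way.
    tar xzf pipslite-<version>.tar.gz && cd pipslite-<version>
    dpkg-buildpackage -rfakeroot -us -uc
    sudo dpkg -i ../pipslite*.deb
    sudo /etc/init.d/cups restart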
Someone will doubtless follow up in comments telling me how stupid I was not to [insert some “easy” way of getting the printer working] which didn’t involve downloading source code and compiling my own binary package, or fiddling about in udev to add new rules, or sullying my pristine upgrade with an unofficial package. Please do! I’m eager to learn. And perhaps someone else with the same problems will find this blog entry when they look for “Ubuntu Epson Stylus Office BX300F” and won’t have to figure things out the hard way like I did.
Please bear in mind when you do that I’m not a neophyte, that I’ve got some pretty good Google-fu, and that I’ve been using Linux for many many years – and it took me 4 hours to re-do something I’d already done once 6 months ago, and wasn’t expecting to have to do again. How much harder is it for a first timer when he buys a USB headset & mic, or printer/scanner, or webcam?
Update: After fixing the problem, I have discovered that the Gutenprint driver mentioned on the OpenPrinting page (using CUPS+Gutenprint) does work with my printer. It seems that if I had done a fresh install, rather than an upgrade, I would not have ended up with an existing printer queue pointing at a no-longer-installed “recommended” driver – as John Mark suggested to me on twitter, pipslite is no longer necessary. In addition, when I tested both drivers with the same image, there is a noticeable difference in the results – the Gutenprint driver appears to use a higher alpha, resulting in colours being much lighter in mid-tones. The differences are quite remarkable.
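If you want to try the same comparison yourself, you can point an existing CUPS queue at the Gutenprint driver from the command line – roughly like this (the queue name and the exact driver string are whatever lpstat and lpinfo report on your system, not values I have verified for the BX300F):

    # List configured queues, then look for a matching Gutenprint driver entry.
    lpstat -p
    sudo lpinfo -m | grep -i gutenprint | grep -i bx300    # search term may need tweaking

    # Re-point the queue at the driver found above, then restart CUPS.
    sudo lpadmin -p <queue-name> -m <gutenprint-driver-from-lpinfo>
    sudo /etc/init.d/cups restart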
May 20, 2009
community, freesoftware, maemo
6 Comments
Fabrizio Capobianco of Funambol wondered recently if there are too many mobile Linux platforms.
The context was the recent announcement of oFono by Intel and Nokia, and some confusion and misunderstanding about what oFono represents. Apparently, several people in the media thought that oFono would be Yet Another Complete Stack, and Fabrizio took the bait too.
As far as I can tell, oFono is a component of a mobile stack, supplying the kind of high-level API for telephony functions which Gstreamer does for multimedia applications. If you look at it like this, it is a natural complement to Moblin and Maemo, and potentially a candidate technology for inclusion in the GNOME Mobile module set.
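I haven’t dug into the API yet, so purely as an illustration of what “a telephony component with a high-level API” means in practice: a daemon answering D-Bus calls from applications, in the same way other GNOME Mobile pieces do. The bus name, object path and method below are my guesses at the shape of the interface, not something I have checked against the real oFono API:

    # Illustrative only – destination, path and method are assumptions.
    dbus-send --system --print-reply --dest=org.ofono / \
        org.ofono.Manager.GetProperties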
Which brings me to my main point. Fabrizio mentions five platforms besides oFono in his article: Android, LiMo, Symbian, Maemo and Moblin. First, Symbian is not Linux. Of the other four, LiMo, Maemo and Moblin share a bunch of technology in their platforms. Common components across the three are: The Linux kernel (duh), DBus, Xorg, GTK+, GConf, Gstreamer, BlueZ, SQLite… For the most part, they use the same build tools. The differences are in the middleware and application layers of the platform, but the APIs that developers are mostly building against are the same across all three.
Maemo and Moblin share even more technology, as well as having very solid community roots. Nokia have invested heavily in getting their developers working upstream, as has Intel. They are both leveraging community projects right through the stack, and focusing on differentiation at the top, in the user experience. The same goes for Ubuntu Netbook Edition (the nearest thing that Moblin has to a direct competitor at the moment).
So where is the massive diversity in mobile platforms? Right now, there is Android in smartphones, LiMo targeting smartphones, Maemo in personal internet tablets and Moblin on netbooks. And except for Android, they are all leveraging the work being done by projects like GNOME, rather than re-inventing the wheel. This is not fragmentation, it is adaptability. It is the basic system being tailored to very specific use-cases by groups who decide to use an existing code base rather than starting from scratch. It is, in a word, what rocks about Linux and free software in general.
May 18, 2009
community
3 Comments
Recently I’ve had a number of conversations with potential clients which have reinforced something I have felt for some time: companies don’t know how to evaluate the risk associated with free software projects.
First, a background assumption. Most software built in the world, by a large margin, is in-house software.
IT departments of big corporations have long procurement processes in which they evaluate the cost of adopting a piece of infrastructure or software, including a detailed risk analysis. They ask a long list of questions, including some of the following:
- How much will the software cost over 5 years?
- What service package do we need?
- How much will it cost us to migrate to a competing solution?
- Is the company selling us this software going to go out of business?
- If it does, can we get the source code?
- How much will it cost us to maintain the software for the next 5 years, if we do?
- How much time & money will it cost to build an equivalent solution in-house?
There are others, but the nub of the issue is there: you want to know what the chances are that the worst will happen, and how much the scenario will cost you. Companies are very good at evaluating the risk associated with commercial software – I would not be surprised to learn that there are actuarial tables that you can apply, knowing how much a company makes, how old it is and how many employees it has which can tell you its probability of still being alive in 1, 3 and 5 years.
Companies built on free software projects are harder to gauge. Many “fauxpen source” companies have integrated “community” into their sales pitch as an argument for risk mitigation. The implicit message is: “You’re not just buying software from this small ISV – if you choose us, you get this whole community too, so you’re covered if the worst happens”. At OSBC, I heard one panellist say “Open Source is the ultimate source escrow” – you don’t have to wait until the worst happens to get the code, you can get it right now, before buying the product.
This is a nice argument indeed. But for many company-driven projects, it’s simply not the case that the community will fill the void. The risk involved in the free software solution is only slightly smaller than buying a commercial software solution.
And what of community-driven projects, like GNOME? How do you evaluate the risk there? There isn’t even a company involved.
There are a number of ways to evaluate the legal risk of adopting free software – Black Duck and FOSSology come to mind. But very little has been written about evaluating the community risks associated with free software adoption. This is closely related to the community metrics work I have pointed to in the past – Pia Waugh and Randy Metcalfe’s paper is still a reference in this area, as is Siobhan O’Mahony and Joel West’s paper “The Role of Participation Architecture in Growing Sponsored Open Source Communities”.
This is a topic that I have been working on for a while now in fits and starts – but I think the time has come to get the basic ideas I use to evaluate risk associated with free software projects down on paper and out in the ether. In the meantime, what kinds of criteria do my 3 readers think I should be keeping in mind when thinking about this issue? Do you have references to related work that I might not have heard about yet?
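To give a flavour of the kind of thing I have in mind – and this is a throwaway illustration of mine, not a metric from the papers above – even the version control history gives you a crude first read on community health:

    # How many distinct people committed in the last year? (a crude bus-factor proxy)
    git log --since="1 year ago" --format='%ae' | sort -u | wc -l

    # How concentrated is the work among the top committers?
    git shortlog -sne --since="1 year ago" | head

Numbers like these say nothing on their own, but tracked over time, and set alongside how decisions get made and who holds commit access, they start to sketch the risk profile I’m talking about.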
May 11, 2009
running
12 Comments
Yesterday, on my second serious attempt (previously I injured myself 4 weeks before the race), I finally ran a marathon in Geneva, Switzerland.
Since getting injured in 2007, I’ve taken up running fairly seriously, joined a club, and this time round I was fairly conscientious about my training, getting in most of my long runs, speed work & pace runs as planned. I thought I was prepared, but I don’t think anything can prepare you for actually running 42.195 kilometers at race pace. Athletes will tell you that the marathon is one of the hardest events out there because it’s not just a long-distance race, it’s also a race where you have to run fast all the time. But until you’ve done it, it’s hard to appreciate what they mean.
This year, the club chose the Geneva marathon as a club outing, and around 40 club members signed up for either the marathon or the half-marathon on the banks of Lac Leman, and I couldn’t resist signing up for the marathon.
I wasn’t in perfect health, since I’ve been feeling some twinges in my right hip & hamstring for the past couple of weeks, but during the taper before the race I took it very easy, and I felt pretty good the day before. With the club we met up on Saturday the 9th after lunch, and drove to Geneva to get our race numbers, and then to the hotel in Annemasse for a “special marathon runner’s” dinner (which had a little too much in the way of lardons, vinaigrette & buttery sauce to be called a true marathon runner’s meal), last-minute preparations for the big day, and a good night’s rest.
Up early, light breakfast, back into Geneva for the race. Arrived at 7am, lots of marathon runners around, and the excitement levels are starting to climb. After the usual formalities (vaseline under armpits and between thighs, taped nipples, visit to toilet) we made our way to the starting line for the 8am start.
Nice pace from the start – a little fast, even, but by the 3rd kilometer I’d settled into my race pace, at around 4’40 per kilometer (aiming for 3h20 with a couple of minutes margin). Walked across every water station to get two or three good mouthfuls of water and banana without upsetting my tummy. Around kilometer 7, I started to feel a little twinge in the hamstring and piriformis/pyramidal muscle, and I felt like I might be in for a long day. It didn’t start affecting me for a while, but by kilometer 16, I was starting to feel muscles seize up in my hip in reaction to the pain.
First half completed on schedule, 1h38’55, and I was feeling pretty good. Not long afterwards, every step was getting painful. Around kilometer 26, I decided (or was my body deciding for me?) to ease off on the pace a little and I started running kilometers at 4’50 to 5′.
They talk about the wall, but you don’t know what they mean until you hit it. Around kilometer 32, I found out. At first, I welcomed the feeling of heavy legs – it drowned out the pain from my hip, and here was a familiar sensation I thought I could manage. But as the kilometers wore on, and my pace dropped, I was having a harder and harder time putting one foot in front of the other. Starting again after walking across the water stops at kilometers 33 and 38 was hard – it was pure will that got me going again. My pace was slipping – from 5′ to 5’30 – one kilometer I ran in 6′. It looked like I was barely going to finish in 3h30, if I made it to the end at all.
Then a club-mate who was on a slower pace caught up to me, tapped me on the shoulder, and said “Hang on to me, we’ll finish together” (“accroche toi, on termine ensemble”). A life-saver. Manna from heaven. I picked up speed to match him – if only for 100m. After that, I said to myself, I’ll try to keep this up for another kilometer. When we passed the marker for 40k, I said I’d make it to 41 with him, and let him off for the last straight. And when we got to the final straight, I summoned up everything I had left to go for the last 1200m.
In the end, I covered those last 3200m in an average of 4’35 per kilometer – which just went to teach me that those 5km when I was feeling sorry for myself were more mental blockage than anything else, and I was able to overcome my body screaming out at me to stop.
The record will show that I ran 3h26’33 for my first marathon, but that doesn’t come close to telling the story.
Afterwards, I got a massage, drank a lot of water, ate some banana, and, feeling emptied & drained, was overcome by a wave of emotion when I realised what I’d done.
Congratulations to the other first-time marathon runners who ran with me yesterday, and thank you Paco, I’ll never forget that you got me to the end of my first marathon.
Update: The marathon organisers had a video camera recording everyone’s arrival during the race. I discovered this afterwards, otherwise I might have been slightly more restrained after crossing the line.
You can see me arriving here, and Paco, who arrived a few seconds after me, here – for the extended sound-track.