April 29, 2010
community, gimp, gnome
22 Comments
When I first installed a Linux distribution in 1996 or '97, it was a horrible experience. My friend who had started a few months before me thought it was great, though – I remember it was Red Hat 5, the first version with an ncurses installer that walked you through the process.
I was confronted with dozens of questions I knew nothing about and wasn’t equipped to answer. Did I want to create a primary or secondary partition? What was its mount point and filesystem type going to be? What was the manufacturer of my video card? What resolution & refresh rate did I need for my monitor (WARNING: the wrong answer can make your monitor explode!)? What was my IP address, netmask, gateway, DNS server? What keymap did I want for my keyboard? Was my NIC using ISA or PCI? Was it 10baseT or 100baseT? Which driver did it need? Was my mouse a PS/2, serial, Microsoft or “Other” (there was always an “Other”)? And on and on it went. How the hell did I know? What did I care?
But it was a learning experience. Installing Linux was the period in my life when I learned the most about how computers work, hardware and software. Back then, if you wanted to try out an application you had heard about, there was only one way to do it – download the source code and compile it. I had a wad of software in /usr/local, including MySQL, the GIMP, Scilab, and a bunch of other stuff I’ve forgotten. There was no online distribution channel. Free software developers didn’t do packaging, and there were no PPAs. If it didn’t come on the install CD, it needed compiling.
It wasn’t better. There were fewer of us. Linux had a name as a hobbyist’s toy for a reason. But those of us who did use it had a certain minimum knowledge of our systems. You knew shell commands because there was no other way to do anything. Everyone knew about fstab and resolv.conf and ld.so.conf and compiling kernel modules, because you had to. And every time you installed software, you had the source code – right there on your computer. And you knew how to compile it.
I don’t know if I would ever have made a patch for the GIMP if I hadn’t had the source code and already gone through the pain of compiling & installing all its dependencies. I doubt it very much. And yet that’s the situation the vast majority of Linux users are in today – they have never compiled any software, or learned about the nuts & bolts of their OS, because they don’t have to.
I remember Nat Friedman talking about this in a presentation he gave a few years ago about how to become a free software developer. Step 1 was “download some source code”, step 2 was “compile and install it”, and step 3 was “find something you want to change, and change it”. And I recall that Nat identified step 2 as the major stumbling block.
We have bred a generation of free software users who have never compiled software, and don’t particularly care to. Is it any wonder that recruitment of developers appears to be slowing, and that prominent older projects are suffering something of a demographic crisis – hoary old 30-year-olds holding down the fort, with no fiery young whippersnappers coming up to relieve them?
Is this a problem? It seems like it to me – so I put the question to a wider audience, with a second question attached: how can we get the hobbyist back into using free software, at least for some part of our community?
March 24, 2010
community, gnome
9 Comments
We’ve been running into some interesting issues with the GNOME census, which are causing us to twist our tiny brains to get useful results. I thought it might be interesting to share some of them.
- gnome.org – a large number of people commit with their gnome.org address (or src.gnome.org), but have also committed with a different address in the past. So many of you received our survey request twice or more (oops). gnome.org addresses pose another problem too – when attempting to identify a developer’s employer, gitdm uses domain name matching, and since so many gainfully employed GNOME hackers use their gnome.org addresses to commit, that doesn’t work very well (oops again) – see the sketch after this list. Finally, we have observed so far that the response rate among unpaid GNOME developers is much higher than the response rate among professional GNOME developers, which has made identifying employers for specific addresses even more difficult.
- ubuntu.com – Some Canonical developers commit with their gnome.org address, some with their canonical.com address, and others apparently use their ubuntu.com address. Some unpaid Ubuntu hackers & packagers also commit with ubuntu.com email addresses. So identifying the exact number of Canonical developers & Canonical upstream commits has proven very difficult.
- Time – many GNOME developers have changed employers at some point, or gone from being unpaid GNOME developers to paid GNOME developers, or changed companies through acquisition or merger. Old email addresses bounce. And yet, it’s the same person. Dealing with time has been one of our toughest challenges, and one where we still don’t have satisfactory answers.
- Self-identity – One of the issues we’ve had running the survey is that simple domain name pattern matching doesn’t tell the whole story. Does someone who works for Red Hat on packaging and then spends his evenings hacking his pet project count as a volunteer or a professional? We have noticed a significant number of people who commit with their professional email addresses and consider themselves volunteers on the GNOME project. For a problem which is already complicated enough, this adds further nuance to any quantitative statistics which result.
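To make the domain-matching problem concrete, here is a minimal sketch of the kind of employer matching gitdm does. The domain map and alias table are invented for illustration – this is not gitdm’s actual code or configuration format:

```python
# Hypothetical domain-to-employer map, for illustration only.
DOMAIN_MAP = {
    "redhat.com": "Red Hat",
    "canonical.com": "Canonical",
    "novell.com": "Novell",
}

# Committers who use a project address but are known (e.g. from survey
# answers) to be paid to work on GNOME. This table has to be maintained
# by hand, one answer at a time.
ALIASES = {
    "hacker@gnome.org": "Example Corp",  # hypothetical entry
}

def employer(email):
    """Guess a committer's employer from a commit email address."""
    email = email.lower()
    if email in ALIASES:
        return ALIASES[email]
    domain = email.rsplit("@", 1)[-1]
    # gnome.org and src.gnome.org addresses tell us nothing about employment.
    if domain.endswith("gnome.org"):
        return "unknown"
    return DOMAIN_MAP.get(domain, "unknown (volunteer?)")

print(employer("jdoe@redhat.com"))  # -> Red Hat
print(employer("jdoe@gnome.org"))   # -> unknown
```

Every gnome.org commit falls through to “unknown”, which is exactly why the survey answers matter so much.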
Thank you to everyone who has taken the time to answer the 3 to 7 questions (depending on how you self-identify) in the survey – the data has been interesting, and has led us to question some of our preconceptions. To those of you who have not answered yet, let me assure you that the email we sent was not spam, and Vanessa is doing a great job collating your answers and (in some cases) preparing follow-up questions.
Any insights people can offer into the issues we’ve noticed are welcome! Please do leave comments.
March 17, 2010
community, freesoftware, gnome, guadec
9 Comments
I’ve been working on a project for the past few weeks, and it’s time to take the wraps off.
For as long as I’ve been involved in GNOME, we have been asked the same questions over and over again: How many GNOME developers are there? Which companies invest in GNOME, and how much? Where can I go for professional GNOME development services? And for as long as I’ve been involved in GNOME, the best answer we could give has been pretty hand-wavey – we talk about hundreds of developers, thousands of contributors, the advisory board, an ecosystem of expert independent companies, but we never quite get around to putting meat on the bones.
I decided that we should do something about that, and so for the past few weeks, an intern called Vanessa has been working to help me dissect the underbelly of the GNOME project.
What is the GNOME Census?
We’re aiming to answer three questions as completely as we can:
- Who develops GNOME, and what do they work on? What does the GNOME developer community look like? How many GNOME developers are there? And how many contributors are doing things other than development?
- What companies are investing in GNOME, and how? Are there modules where companies are co-operating, or have contributing companies been concentrating on disjoint parts of the project?
- Finally, if you’re a company looking for expert developers for custom GNOME development, where should you go? What does the commercial ecosystem around the GNOME project look like?
We’ve been using tools like gitdm, cvsanaly and artichow to get some nice quantitative data on modules in GNOME git and freedesktop.org repositories. We will be running a survey of GNOME developers, and doing one-on-one interviews with key people in the GNOME commercial ecosystem to go beyond the figures and get some qualitative information about future plans and priorities as well.
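To give a flavour of the quantitative side, here is a minimal sketch, in the spirit of gitdm (not its actual code), of the most basic figure we start from – commit counts per author email domain, run from inside any git checkout:

```python
import subprocess
from collections import Counter

# One author email per commit in the current repository.
log = subprocess.run(
    ["git", "log", "--format=%ae"],
    capture_output=True, text=True, check=True,
).stdout

domains = Counter(
    line.rsplit("@", 1)[-1].lower()
    for line in log.splitlines() if "@" in line
)

for domain, commits in domains.most_common(10):
    print(f"{domain:30} {commits}")
```

Everything interesting – merging identities, mapping domains to employers, weighting by lines changed – happens on top of a table like this.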
So why take on the project?
Well, it seemed like fun. Answering interesting questions is always challenging and rewarding. And it also seemed useful – if people are always asking for this information, there must be a reason they want to know, right?
Financially, this is an investment. I am paying Vanessa to help with the study, and it is taking a lot of my time. I initially looked for a sponsor for the project, but the reaction was tepid – no-one wanted to bear the full cost of the report, but everyone I spoke to agreed that it would be useful and that they would definitely like a copy when it was done. So I hit on the following idea for funding the project:
When the report is eventually available, I will be selling some copies to recoup costs. When I have sold a sufficient number to cover the cost of the project, I plan to release the report under a Creative Commons license. Those who are eager to get the results and information sooner rather than later will subsidise the availability of the report for everyone. I have submitted a proposal for GUADEC to present the conclusions of the report, and I anticipate that it will be available under a free licence by then.
Who’s the target audience?
ISVs are interested in knowing how active projects are before committing resources; the GNOME Census will help reduce the uncertainty when choosing GNOME as a platform. GNOME distributors will be able to leverage this report to show the vibrancy, size, activity and commercial ecosystem around the GNOME platform. For companies who have been long-time investors in GNOME’s success, the census will give well-deserved recognition, especially in areas where that investment has not been very visible to end users, but has had a huge effect on the quality of the user experience. Finally, for companies building software platforms on top of GNOME, this report will allow swift identification of credible service providers in the GNOME commercial ecosystem, through their involvement in GNOME and the core developers working for them.
So what now?
We will be launching a survey this week asking GNOME developers who they work for, and whether they have worked for other companies previously – because of the widespread use of gnome.org email addresses, it has unfortunately not always been easy to identify the companies behind the people. We also want qualitative information on the projects you work on, whether you work on GNOME in your free time, and more. We will be breaking down GNOME development by core platform, external dependencies, GNOME desktop, GNOME hosted applications and other GNOME applications. Vanessa will be sending out a very short survey to everyone who has committed to GNOME, and we need your help to make the census as useful as possible to the GNOME project.
Thanks for your help!
March 15, 2010
community, freesoftware, gnome, guadec, maemo
1 Comment
I just realised this morning that after a very long call for participation period, we’re now in the last week before the GUADEC proposal deadline – you should have proposals in by 23:59 UTC on March 20th to be eligible for selection (although a little birdie tells me that might get extended to the end of the weekend). Of course, I knew that the deadline was somewhere near the end of March, but I didn’t realise that we’d gotten so far through the calendar!
So get your proposals in about all things GNOME, GNOME 3, GNOME Mobile, usability, accessibility, webability, open data, free services, scaling the community, developer tools, whatever – but get them in quick. It’s better to get a poor proposal in now & improve it next week than wait until next week to polish what you have now.
For guidelines on a good talk proposal, I really like the OSCON guidelines as a list of good dos & don’ts for conference proposals – in general, make the proposal (and your presentation, if accepted) not about you or your project, but about your audience and what they can do with your project – so clearly identify the target audience & why they would attend, and make the title short & action-based, rather than vague, weird or overly clever.
Good luck to teuf and his merry band evaluating all the proposals!
March 5, 2010
community, gnome
3 Comments
In honour of the recent discussions on foundation-list, I would like to point everyone once again to this piece by Dan Spalding, which I’ve mentioned previously. It had a huge influence on me, and hopefully will on others too.
As a teaser, here’s an extract describing the target audience:
Consensus decision making is a model of the society we want to live in, and a tool we use to get there. Men often dominate consensus at the expense of everyone else. Think about the man who…
- Speaks long, loud, first and often
- Offers his opinion immediately whenever someone makes a proposal, asks a question, or if there’s a lull in discussion
- Speaks with too much authority: “Actually, it’s like this…”
- Can’t amend a proposal or idea he disagrees with, but trashes it instead
- Makes faces every time someone says something he disagrees with
- Rephrases everything a woman says, as in, “I think what Mary was trying to say is…”
- Makes a proposal, then responds to each and every question and criticism of it – thus speaking as often as everyone else put together (Note: This man often ends up being the facilitator)
It’s rarely just one man who exhibits every problem trait. Instead it’s two or three competing to do all the above. But the result is the same: everyone who can’t (or won’t) compete on these terms – talking long, loud, first and often – gets drowned out.
This is a result of society’s programming. Almost no men can actually live up to our culture’s fucked up standards of masculinity. And our society has standards for women that are equally ridiculous. In one way, we both suffer equally. That’s why we all yearn and strive for a world where these standards – which serve to divide us and reduce us and prop up those in control – are destroyed.
In another way these standards serve those who come closest to living up to them. Sure, we all lose when a few men dominate a meeting. But it’s those men who get to make decisions, take credit for the work everyone does, and come out feeling more inspired and confident.
Like I said, Dan’s piece opened my eyes to my own bad behaviour, and also helped me improve as a meeting/round-table/discussion facilitator. Hopefully a reasoned, reflective analysis of their own behaviour by the most disruptive elements of foundation-list will have a similar effect on them. I certainly hope so.
February 3, 2010
freesoftware, gnome, marketing
2 Comments
More and more, we’re seeing organisations outside the free software world try to learn the lessons of our success and integrate “open source” practices into their own operations.
Whether it’s companies adopting transparency and other cluetrain or pinko marketing strategies, proprietary software development companies integrating standard free software practices, or one of the other areas where “crowdsourcing” has become the cool new thing, it’s obvious that we have gotten some things right, some of the time, and it is definitely worth learning the right lessons from projects like Linux, Mozilla, GNOME, or Wikipedia, and trying to reproduce the magic elsewhere.
Sometimes this feels like the cargo cults of the Pacific Islands, building airstrips and imitation airplanes to try to make the airplanes land as their ancestors saw them do 60 years ago. But at least these organisations are trying to figure out what makes our communities successful.
But are we learning enough lessons from others? It seems to me like we’re charging head-first, like sharecroppers into undiscovered country, only to find that we’ve run into a highly advanced civilisation.
As developers, we’ve invented our own brand of everything, from scratch. We figure out for ourselves how to run conferences, or how to raise money from people who like what we do, when these are not new problems.
This isn’t new in IT. The entire learned history of typography got thrown out the window, more or less, because with the advent of WYSIWYG editors and the web, everyone has complete control of their authoring tools – Comic Sans is shipped by default, and if I need to reduce the margins to get the letter to fit on one page, then by golly I will.
Merchandising and recruitment of new star talent are more examples of things that some other organisations are pretty good at.
So – as an open question – are we learning the lessons from the past which we should be learning, or is it too attractive to think that what we’re doing is so new that every problem we encounter needs a new solution?
One example of a place where there is a wealth of experience out there is convincing people to give money to a cause they believe in. There are dozens of organisations that do this well – humanitarian organisations, political lobbyists, political parties, universities – the list goes on.
Can we figure out how GNOME is like them, and learn the lessons from their fundraising campaigns?
A typical fundraising drive for an organisation like this has three main steps:
- Get a list of potential donors
- Convince them that you are doing good
- Find a pressure point or argument which will convince them to donate
If you look at a mailing for Médecins Sans Frontières for example, you see all of these points in action. Find potential donors – through sign-up campaigns, former donor drives, referrals. Send them a mail package, with a newsletter outlining good work, but with just enough bad news (new conflicts, new refugees, unfinished projects) and artwork (a smiling nurse taking care of a village vs a child ill from a curable illness) to show that money given to MSF will do good, and the need has never been greater.
Your response rate may be small – perhaps only 1% – but that’s enough.
Whether we’re talking about lobby groups, political parties or humanitarian agencies, the same strategies come into play – construct big databases of potential donors, and get them riled up about the thing they’re passionate about being endangered – show them the shining light of all the good work your organisation does, and then drive the sale home by making it really easy to give money or sign up.
University fundraising is an interesting case – and in fact, GNOME’s fundraising model now resembles it. Your primary source of donations is alumni – people who have been through the university and like receiving updates every year: maybe a class-mate just became a professor, maybe a friend’s daughter got a prize in the annual awards ceremony, maybe a club or association you were in had a good year. And then you leverage that affection with the flip side of the coin – the need, the things we’d like to do better, the project we’re fundraising for which will allow us to do great work.
All of these organisations invest heavily in direct mailing, in building and maintaining databases of supporters, and in monetising them. I recently read “My First 40 Years in Junk Mail”, a book by a direct-mail copywriter, and it opened my eyes to what works in that world – and also gave me some ideas on the kinds of strategies the GNOME Foundation should perhaps be adopting.
The first step is building and maintaining a list of GNOME fans and supporters, by any means possible, and ensuring that they are made aware of what we’re up to and what we’d like to do. And, of course, continuing to build great products.
December 24, 2009
community, freesoftware, gnome, maemo, marketing, running, work
No Comments
Looking back on 2009, I wrote quite a bit on here which I would like to keep and reference for the future.
This is a collection of my blog entries which gave, in my opinion, the most food for thought this year.
Free software business practice
Community dynamics and governance
Software licensing & other legal issues
Other general stuff
Happy Christmas everyone, and have a great 2010.
December 14, 2009
community, freesoftware, gnome
6 Comments
I’ve stayed quiet on this, but listened from the sidelines, for a while now. But the blog posts I read today from Monty and Mneptok led me to reply.
I was a long-time Sun shareholder (don’t laugh) but sold my shares as soon as the Oracle acquisition was announced. I was pretty ambivalent about the deal at the time, not really taking position on either side of the fence, and happy just to think about possibilities.
But the latest lobbying of the EU to try to stymie the deal has ticked me off.
MySQL, through their choice of licensing and business model, set the rules of the game. Sun bought MySQL for lots of money. It’s their property now. It is, as Michael Meeks said, very bad form for the guy who set up the rules to complain that they’re not fair now.
So what will the effect of Oracle’s purchase of Sun Microsystems be?
First, Oracle offered $7.4bn for Sun, while Sun (over)paid $1bn for MySQL at the beginning of 2008. That means, being generous, that MySQL makes up roughly 13% of Sun’s value. And about the other 87%, apparently, no-one is worried.
Second, Sun is haemorrhaging money. This is not surprising; any time a company offers to buy another company, all the existing customers who were planning purchases wait until the acquisition is finished. They want to know which product lines are being maintained, and whether licensing, support or pricing conditions will change. In short, it is expected that a company’s revenues go into the toilet between the moment an acquisition is announced and the moment it is finalised.
Third, friends of mine work at Sun. I’m seeing them be miserable because they don’t know what role they have to play in the company. They don’t know if they’re going to have a job in a few months – and the chances that they will are inversely related to how long this acquisition takes to complete. Low employee morale during uncertainty is another inevitable consequence of the delay, and one with longer-term consequences for the health of the company than any short-term delayed purchase decisions.
The uncertainty is killing Sun, and it’s killing the projects that Sun owns – MySQL among them. One possible outcome of all of this is that Oracle comes back with a lower offer price after everything shakes out, because frankly Sun is worth less; another is that the deal falls through, leaving Sun as a company on life support.
I have read RMS’s letter to Neelie Kroes, and I respectfully disagree. The entire letter reads as an advocacy of dual licensing as the way to make money from a free software project – an astounding position given the signatories. To quote: “As only the original rights holder can sell commercial licenses, no new forked version of the code will have the ability to practice the parallel licensing approach, and will not easily generate the resources to support continued development of the MySQL platform.”
I had to check twice to ensure that the thing was indeed signed by Richard Matthew Stallman, and not someone else with the same initials.
MySQL is available under the GPL v2, a well-understood licence. Oracle will be free to take future versions closed-source. They will be free to change the licence (perhaps even to GPL v3). They will even be free to kill development of the project altogether. Does this put companies like Monty Program at a disadvantage compared to Oracle? Perhaps. Is that disadvantage insurmountable? Not at all. MariaDB and Drizzle have a great chance of succeeding in the same way MySQL did – by disrupting the database market.
The whole thing smells to me like a double standard – it’s OK to have certain licensing policies if you’re friendly, but not if you aren’t. Luis Villa set me straight on this point a few years back, and it has stuck with me: “what if the corporate winds change? […] At that point, all the community has is the license, and [the company’s] licensing choices”. You trust the license, and the licensing choices. And at this point, I’m more concerned about the jobs of the people working at Sun, and the future of Sun owned projects, than I am about what Oracle will or won’t do with/to MySQL.
November 24, 2009
community, freesoftware, gnome
11 Comments
It was with some trepidation that I plugged an external monitor into my laptop to test how Ubuntu 9.10 handles external displays. In my last three upgrades the behaviour has changed, and I’ve ended up on more than one occasion in front of a group, telling them I’d get started in just a minute…
But when I plugged in an external CRT monitor yesterday, to see how things would react ahead of a training course I was giving this morning, I was pleasantly surprised! The new screen was automatically added to the right of my existing screen to make a large virtual desktop. When I opened the display preferences, mirroring the screens worked perfectly. When I unplugged the CRT, the desktop degraded gracefully – nothing froze or crashed, I didn’t need to reboot, and all the applications which had been displaying on the external screen were seamlessly moved to my laptop display. Bliss! Everything worked just as I expected it to.
So kudos to the Ubuntu integrators, and the Xorg and GNOME developers, and especially to the developers working on the Intel X drivers, for making me smile yesterday. You have given me hope that this year I will attend at least one tech conference where no Linux user has trouble with the overhead projector.
Update: I meant Karmic Koala, Ubuntu 9.10, not Jaunty. Thanks to Marius Gedimas for pointing that out.
September 17, 2009
community, freesoftware, General, gimp, gnome, maemo, work
5 Comments
(Reposted from Neary Consulting)
Mal Minhas of the LiMo Foundation announced and presented a white paper at OSiM World called “Mobile Open Source Economic Analysis” (PDF link). Mal argues that by forking a free software component to adjust it to your needs, running intensive QA, and shipping it in a device (a process which can take up to two years), you are leaving money on the table by way of what he calls “unleveraged potential” – you don’t benefit from all of the features and bug fixes which have gone into the software since you forked it.
While this is true, it is not the whole story either. Trying to build a rock-solid software platform on shifting sands is not easy. Many projects do not commit to regular stable releases of their software. In the not too distant past, the FFmpeg project, universally shipped in Linux distributions, had never made a formal release at all. The GIMP spent from version 1.2.0 in December 2000 to 2.0.0 in March 2004 in unstable mode, with only bug-fix releases on the 1.2 series.
In these circumstances, getting both the stability your customers need and the latest & greatest features is not easy. Time-based releases, pioneered by the GNOME project in 2001 and now almost universally followed by major free software projects, mitigate this. They give you periodic sync points where you can get software which meets a certain standard of feature stability and robustness. But no software release is bug-free, and this is true for both free and proprietary software. In The Mythical Man-Month, Fred Brooks described the difficulties of system integration, and estimated that 25% of the time in a project would be spent integrating and testing relationships between components which had already been planned, written and debugged. Building a system or a Linux distribution, then, takes a lot longer than just throwing the latest stable version of every project together and hoping it all works.
By participating actively in the QA process of the project leading up to the release, and by maintaining automated test suites and continuous integration, you can both mitigate the shifting sands of unstable development versions and reduce the integration overhead once you have a stable release. At some stage, though, you must draw a line in the sand and start preparing for a release. In the GNOME project, we have a progressive freeze: first the API & ABI of the platform, then the features to be included in existing modules, then new module proposals, strings and user interface changes, before finally we reach a complete code freeze pre-release. Similarly, distributors decide early which versions of components they will include on their platforms, and while occasional slippages may be tolerated, moving to a new major version of a major component of the platform would send integration testing more or less back to zero – the overhead is enormous.
The difficulty, then, is what to do once this line is drawn. Serious bugs will be fixed in the stable branch, and they can be merged into your platform easily. But what about features you develop to solve problems specific to your device? Typically, free software projects expect new features to be built and tested on the unstable branch, but you are building your platform on the stable version. You have three choices at this point, none pleasant – never merge, merge later, or merge now:
- Develop the feature you want on your copy of the stable branch, resulting in a delta which will be unique to your code-base, and which you will have to maintain separately forever. In addition, if you want to benefit from the features and bug fixes added to later versions of the component, you will incur the cost of merging your changes into the latest version, a non-negligible amount of time.
- Once you have released your product and your team has more time, propose the features you have worked on piecemeal to the upstream project, for inclusion in the next stable version. This solution has many issues:
- If the period is long enough, the codebase will have evolved far beyond the point you developed against, and merging your changes into the latest unstable tree will be a major task
- You may be redundantly solving problems that the community has already addressed, in a different or incompatible way.
- Feature requests may need substantial re-writing to meet community standards. This is doubly a problem if you have not consulted the community before developing the feature, to see how it might best be integrated.
- In the worst case, you may have built a lot of software on an API which is only present in your copy of the component’s source tree, and if your features are rejected, you are stuck maintaining the component, or re-writing substantial amounts of code to work with upstream.
- Develop your feature on the unstable branch of the project, submit it for inclusion (with the overhead that implies), and back-port the feature to your stable branch once included. This guarantees a smaller delta from the next stable version to your branch, and ensures your work gets upstream as soon as possible, but adds a time & labour overhead to the creation of your software platform
In all of these situations there is a cost: the time & effort of developing software within the community and back-porting it, the maintenance cost (and related unleveraged potential) of maintaining your own branch of a major component, or the huge cost of integrating a large delta back into the community-maintained version many months after the code has been written.
Intuitively, it feels like the long-term cheapest solution is to develop, where possible, features in the community-maintained unstable branch, and back-port them to your stable tree when you are finished. While this might be nice in an ideal world, feature proposals have taken literally years to get to the point where they have been accepted into the Linux kernel, and you have a product to ship – sometimes the only choice you have is to maintain the feature yourself out-of-tree, as Robert Love did for over a year with inotify.
While Mal addresses the raw value of the code produced by the community in the interim, he does not quantify the costs associated with these options. Indeed, it is difficult to do so. In some cases, there is not only a cost in terms of time & effort, but also in terms of the goodwill and standing of your engineers within the community – the type of cost which is very hard to put a dollar value on. I would like to see a way to do so, though, and I think it would be possible to quantify, for example, the community overhead (as a mean) by looking at the average time for patch acceptance and/or the number of lines modified from initial proposal to final mainline merge.
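As a very rough sketch of what that metric might look like – assuming you had already extracted proposal dates, merge dates and line counts for a set of patches from the mailing list archives and the git history (the data below is invented):

```python
from datetime import date
from statistics import mean

# (date proposed, date merged, lines changed) - invented sample data.
patches = [
    (date(2009, 1, 10), date(2009, 3, 2), 420),
    (date(2009, 2, 5), date(2009, 2, 20), 35),
    (date(2009, 4, 1), date(2009, 9, 15), 1800),
]

# Mean time from initial proposal to final mainline merge.
acceptance_days = [(merged - proposed).days
                   for proposed, merged, _ in patches]
print("mean days to acceptance:", mean(acceptance_days))

# A crude per-line view of the same overhead: bigger deltas should
# predict longer review cycles.
print("mean days per 100 lines:",
      mean(100.0 * (merged - proposed).days / lines
           for proposed, merged, lines in patches))
```

The hard part is not the arithmetic, of course, but reliably pairing a mainline commit with the first public posting of the patch.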
Anyone have any other thoughts on ways you could measure the cost of maintaining a big diff, or the cost of merging a lot of code?