Ekiga is slowly coming

January 26, 2008

Well, more precisely : it’s only slowly coming from my end ; other areas are progressing pretty nicely, with interesting new features soon to come.

Most of the main structures are in place, and the work now is to build little code stubs to link it all together : it should now be possible to fix some long-standing issues, like the video preview still running while out of call and hidden, or inhibiting the screen saver during a call and re-enabling it afterwards… it should also be possible to re-enable some previously working features, like avahi presence publishing.

There are still important main structures which need some engine-ing, like accounts or the text chat. I already have code for the latter, but need to rework it in light of recent changes I made elsewhere : the code should be much shorter now that I have factored out the needed features…

Things would happen faster if I didn’t happen to care about my students…

Cooling the heated discussion

November 27, 2007

I’ll just try to make no link to others’ blogs, and cite no name.

First of all : an election is a time to discuss. And discussion can get heated. Don’t ask people to shut up or retract. Democracy is all about letting people speak…

Contrary to what someone proposed, fists are for beasts. That was a very wrong proposal!

Insults are wrong and rude. Even if they end with a “:” and an explanation of why.

I really don’t like the form of the starting post, but I agree with the point.

I have contributed to ekiga for a long time. I have read planet gnome for a long time. Most of my open source work was behind the scenes, playing with the code ; I was only visible on #ekiga and #gnomefr.

Then earlier this year (as a teacher I count years from august-september…), I decided it would probably be worth making myself more publicly known. I added #gnome-hackers to my auto-joined channels, decided to request an @gnome.org mail address, created a blog on blogs.gnome.org… and tried to get added to the planet to join the choir (end of august).

And this is where things got tricky : how does one get added to that community planet? One name. Let’s ask. No answer. Let’s try to get him on irc. Hard : the earth isn’t flat and we don’t live on the same side. I’m told my blog has only just opened, I should blog for some time, then I’ll get added. On the one hand, it makes sense — on the other hand, while the blog is new, I am not.

Refresh planet.gnome.org next morning : eh, some guy just got added. He had blogged just once, six months earlier, and his second post, “Thanks”, is on the planet. And he doesn’t seem to be a long-time gnomer.

I’m not saying adding him was wrong : being a welcoming community is important. But why does an oldie like me have to go through a longer process?

Ooohhh… someone complains about planet.gnome on the foundation list… the board takes notice of the problem and puts out a politically-correct statement : things will move. My blog got added at the end of october.

Did I mention my first post on the foundation-list was to complain loudly about proposing to extend the term of the *current* board by six months?

I get the impression that the community is broken into two camps (“broken” fits) :

  • friends, who don’t see the problem, because for them the answers are always fast and timely ;
  • the others, who do see the problem.

That being said, I do want to acknowledge that we want to get help from everyone. Kicking someone out is just wrong.

There isn’t just a person problem : there’s an organisation problem. Nobody should have been able to get a nepotist grip on anything in the first place. Hell, would I have done any better!?

Years ago, when Damien decided to give me access to the then-cvs of then-gnomemeeting, it took a very long time. This very year, when Damien and myself decided Mathias needed access to the now-svn of now-ekiga, it took a very long time, during which he had to go through us to get his work in — I so feared we would lose him over that!

To have a blog on blogs.gnome.org, you first need to get an @gnome.org alias, for which you first have to…, for which you first have to… In case you don’t know, we french coined the word “bureaucratie”, which became “bureaucracy” on the other side of the Channel and beyond : I can tell you for sure this is one!

On the other hand, something didn’t look right in wormux. I got the svn sources, made a patch, uploaded it to gna! For this, I had to create an account there : five minutes. The developers committed it, but kept the bug open because there were other similar things that could be done elsewhere. Eh, I know how to do that! Second patch. I was offered an svn account to commit myself [Yes, after only two patches, but they were both ready to commit, and I was already known as an ekiga developer : that helps 😉 ] (discussion translated from french) :

  • Eh guys, you should commit right now, the patch will get rotten before I…
  • If you upload your ssh key, then you’re one hour away from committing — sorry, it’s a cron, we can’t go faster.
  • !!!!!!!!!

I committed within an hour after I uploaded my ssh key. Gasp.

I hope I managed to write something of a higher standard than some of the things I have read on the gnome planet these last hours!

OCaml, eclipse, ekiga…

November 17, 2007

Last time I blogged about how python saved my day, but I didn’t really enjoy the experience : by itself, the language will only discover huge mistakes when hitting the right code path. Yes, I know : “Use unit testing!”. Right. For me, unit testing is about checking that corner cases are handled correctly, not about checking that the code even makes sense. It’s about making something 90% correct go to 99.999% correct.

So I turned again to a stronger language, one which will tell me exactly how wrong my code is (and where and why) the minute I write it : ocaml.

Of course, as I said last time, it lacks decent XML support : it does have simple parsers, but nothing serious. If you load an XML file, you can’t really edit it in place : you have to handle saving yourself, which means that if you parse something like <foo detail='bar'>...</foo> but only know about the foo tag and not the detail attribute, then you will save without the unsupported attribute ; the same problem occurs if there’s an unsupported child.
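To illustrate what I mean by editing in place : with a generic tree-based parser, an attribute your code knows nothing about still survives a load-edit-save cycle. A quick sketch in Python, using the stdlib’s xml.etree.ElementTree purely as an illustration (the tag and attribute names are the ones from the example above) :

```python
import xml.etree.ElementTree as ET

# A document with an attribute ('detail') our hypothetical code knows
# nothing about.
doc = ET.fromstring("<foo detail='bar'>some text</foo>")

# Edit "in place" : touch only what we understand...
doc.text = "edited text"

# ...then serialize the whole tree back.
out = ET.tostring(doc, encoding="unicode")

# The unsupported attribute survives the round trip, because the parser
# kept the full tree instead of a typed view of it.
assert "detail=" in out and "edited text" in out
```

A parser that maps the file onto a fixed, typed structure silently drops that attribute on save ; that is exactly the behaviour I would like to see fixed.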

Still, as I have hope that things will improve (well, I may even have a hand in it), I went forward.

I even discovered OcaIDE, an eclipse plugin for OCaml, which works pretty well with gcj, but with a pretty annoying performance issue on completion — which disappears with Sun’s virtual machine (problem reported).

Together with the new ocamlbuild tool, which takes care of most of the dependency tracking, things went fast : I have almost duplicated the python features I already had. Partly because I factored things a little differently, but mostly because I was discovering eclipse, OcaIDE and ocamlbuild (with python, I was only discovering the specific modules I was using : I already knew emacs and the module system).

Of course, things would go faster if I stopped assigning work to my students every week… which means reading and annotating like crazy.

I also brought back to life the skeleton of ekiga’s future call history, which had been compiled out for a few weeks… and I learnt I would have to rewrite the evolution-data-server at least partially. Sigh.

My last post about the sorry state of my workflow triggered a few answers :
many thanks!

The most interesting was to use a database : after all, organizing data is the whole point of databases, isn’t it?

So I looked around for what is available to make my life easy, and found
glom (well, I had already read a few blog posts about it on planet gnome, but
never dug into it).

It turns out it’s not in debian (and probably won’t be anytime soon : a package has been in the works for so long…), and its dependencies aren’t up to date anyway.

So I chased them one by one, using the sources and debian’s .diff.gz to get updated packages (I hate using ‘make install’) — as an aside, that reminded me a lot of the early days of gnome. Sigh. I was so patient at that time.

Finally, I could compile and install glom (using ubuntu’s .diff.gz to get nice packages this time, since debian lacked the equivalent). It even ran, but it was a little too unstable for my taste : starting and stopping the database caused issues, and it didn’t like my passwords, for example [yes, I reported].

These stability issues mean I can’t use it for work, but I’m still pretty impressed by what I saw, and I’ll have to find a problem to solve with it. If said problem doesn’t exist, I’ll just invent it 😉 .

Using a database directly didn’t look as promising, so I left that option out.

Embedding the tags directly into my files is definitely out too : it’s gross, and would assume I only have text files, where it’s easy to embed things. That is the case with texmacs files, as with most of the things I use (giac/xcas, maxima, gnuplot, C, C++, …), but what if I want to associate an image with an exercise? Or anything binary-based, where embedding text and using grep isn’t that easy?

So I went for the solution where I store the metadata separately, in helper
files. And I stored the metadata as XML files, since that’s both easy to use, and extensible.

At first, I tried using C + libxml2. The first tests went pretty well, and things looked pretty promising. Then I typed something like :
$ ./exo-tag-create "Réduction d’endomorphismes"
12
$ ./exo-tag-text 12
<partial garbage>

I tried a few magical functions (g_locale_to_utf8, for instance) to sanitize my strings before feeding them to libxml2, but couldn’t get things right. I just hate it when a stupid little thing completely unrelated to the problem at hand gets in the way — I didn’t expect any issue, since my env has “LANG=fr_FR.UTF-8”… so things should have just worked!

When a problem doesn’t surrender fast enough to one approach, the best thing is generally to try another. I dug around to see if ocaml had gained a decent XML lib since I last searched for one… unfortunately, no (such a shame for such a beautiful and powerful language!), and I didn’t want to dust off the code I wrote last year for babili.

Next try : apt-cache search python | grep xml
The debian description of python-lxml mentioned it uses unicode strings, so I settled on it, and quickly built a pair of scripts to test whether my little problem still applied : no. Perfect!
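The check I mean is just a round trip of the accented string. Here is a minimal sketch of it, using the stdlib’s xml.etree.ElementTree instead of lxml (the <tag id="…"> schema is made up for the illustration ; with python, strings stay unicode throughout) :

```python
import xml.etree.ElementTree as ET

# The accented label that came back as partial garbage from the C version.
label = "Réduction d’endomorphismes"

# Hypothetical schema : <tag id="...">label</tag>
tag = ET.Element("tag", id="12")
tag.text = label

# Serialize to UTF-8 bytes, then parse them back.
data = ET.tostring(tag, encoding="utf-8")
back = ET.fromstring(data).text

assert back == label  # no partial garbage this time
```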

So I wrote a first series of scripts to manage tags, and then another to manage items. Since there was too much code duplication, I refactored things. Then I tried to write gtk+ helpers using the refactored code : easy. Then little scripts to split my existing texmacs files into chunks, and to join chunks into a new document. I don’t have all I need yet, but it’s already taking a pretty exciting shape.

Python saved me. It made things pretty easy, but I can’t help disliking it nonetheless : the language is too weakly typed for my taste (even if using pychecker helps alleviate this). I really prefer more serious languages like eiffel or ocaml (the latter leads to terser code : type inference rules!).

In a few months (or years, if necessary), I’ll be able to push my data to a nice
database. And if so, I really look forward to using glom.

Past : itched by gnomemeeting

Years ago, I was discovering a wonderful program : gnomemeeting. It already made it possible to make calls with both audio and video (that was years ago, and some projects still struggle to do things half as well!). And I happened to have two video devices available : a USB webcam, and an IEEE1394/FireWire/iLink/whatever-the-vendor-felt-would-sell-more DV camcorder.

And this is where things got tricky. Pwlib, the library which gnomemeeting used to handle devices, had a PVideoInputDevice class (it’s C++), and several implementations of it — moreover, it had implementations for both my devices! That was really a feat :

  1. at the time the kernel had very, very poor device support for video — but both had a driver
  2. someone else already added support to pwlib!

The big issue was that switching from one device to another required recompiling the whole of pwlib with a different option. Ouch.

So here was the itch. And this one turned me from a libre software user into a libre software developer : I decided I would make it possible to switch them at runtime — which mostly meant changing the structure from an #ifdef tree to a nice class tree, with dynamic loading and a simple listing ability. Of course, C++ made it a pain, because I had to track down each and every method to add the “virtual” keyword all over, but I’ll rant about C++ another day (and perhaps also say a good thing or two about it — three if Dodji hits me hard enough).

Present : pretty good, but still

This initial contribution prompted other contributions, which triggered other contributions, which… This is why ekiga (gnomemeeting’s new name) has very good device support these days, through the use of plugins, which exist for various platforms (GNU, *BSD, win32, …).

But those plugins themselves are all based directly on platform-specific APIs. We have ALSA, OSS, V4L, V4L2, AVC and DC, just to name those which work on GNU/Linux based systems. Whenever some incompatibility happens, we have to fix it ourselves, for ourselves : it’s all on us.

And this is why I’m not satisfied. I would rather see pwlib have support for a single portable audio and/or video framework, shared by as many projects as possible. That would have huge advantages : first, the user interface would be simpler (we currently make the user choose among a list of badly-named managers before they choose the device), and then both problems and solutions would be shared across many shoulders.

Notice that this isn’t pwlib bashing : the current organisation made sense, since there was no such framework at that time. Things are different now : there are different choices — and possibly we can take several of them at first before picking the most suitable.

Future : choices

Using portaudio

Portaudio is a nice portable audio framework — we are almost using it already. The positive part of ‘almost’ is that we do have a plugin. The negative part of ‘almost’ is that it doesn’t work correctly yet. Let me admit that I’m a little stuck. 🙂

Using GStreamer

This one I don’t think I need to link to here : everyone must have heard about it already. This is a wonderful framework. Does audio and video. Will probably do what ekiga needs network-wise at some point in the future. Unfortunately, it is wonderful only as long as you manage all of your data with it — things get more difficult if you just want to use it partially.

Let me go into more detail here : my original plan was to write device plugins for pwlib, then, as time went by, use it more and more up the data streams until it’s everywhere. Yes, that framework makes me giggle, so I would like it used more.

But the problem I hit is that to do that, you need to do some plumbing between the GStreamer framework and pwlib’s own stream framework. And this isn’t easy. I do have some experimental code, both for audio and video, but it’s hackish ; the real solution would be to fix this bug, which is precisely about having bridge elements to allow plumbing (not just to pwlib, but to whatever other framework you may want to throw at it).

Of course, I didn’t just report issues : I also tried to provide minimal examples showing them, and even some patches. Eh, the goal is to share, after all. Both code and problems!

Conclusion

So as you can see, contributing to device support is a very good way to start hacking on a project : if anyone wants to lend a hand — please do!

And don’t think you’ll always be stuck with devices : it’s just that you’ll always come back to them 😉

Part of my work consists of having loads of exercises, and being able to come up with short lists of them which fit some criteria ; like “can use numeric series, cannot use function series, can use topology, cannot use endomorphism reduction”…

My current work organization is to have a few such lists, organized in chapters, and to hand-pick the matching ones from there. It does work, but it is quite slow and painful.

So I’m looking for some kind of tagging system to help me with it.

The first solution I came up with was : put each exercise (+optional solution[s]) in a little file and try to use emblems in a file browser. Unfortunately, that meant :

  • a huge number of files to create
  • emblems to create
  • right-clicking each of the numerous files to add the right emblems
  • I don’t know exactly how to search on multiple criteria
  • how does one back this up?!

A better solution I’m considering is : put each exercise (+optional solution[s]) in a little file, then create a structured file to associate tags with each file. If I go for the “filename tag1 … tagN” solution, then searching is just a grep away. Combined with some awk-fu and some scripting, I can probably automate the creation of a document with all the exercises. Backup is easy. Pending issues :

  • still requires creating a huge lot of files
  • in which format do I store the exercises(+optional solution[s]) so I can easily automate the merging?
  • in which format do I write the exercises(+optional solution[s]) so it’s easy to write&edit afterwards?
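The searching part of the “filename tag1 … tagN” scheme boils down to set operations ; a minimal sketch in Python, with made-up filenames and tag names (grep/awk would do just as well, this is only to show the logic) :

```python
# Each line of the hypothetical index file : "filename tag1 ... tagN".
index_text = """\
exo-001 numeric-series topology
exo-002 function-series
exo-003 numeric-series endomorphism-reduction
"""

def search(text, wanted, unwanted):
    """Return filenames whose tags include all of `wanted` and none of `unwanted`."""
    hits = []
    for line in text.splitlines():
        name, *tags = line.split()
        tagset = set(tags)
        # "can use" means the tag must be present, "cannot use" that it must not.
        if wanted <= tagset and not (unwanted & tagset):
            hits.append(name)
    return hits

hits = search(index_text,
              wanted={"numeric-series"},
              unwanted={"function-series", "endomorphism-reduction"})
print(hits)  # → ['exo-001']
```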

I’m currently using the very, very good texmacs to write all my documents, so it would be nice to be able to do something with it. It would be even better if I managed to produce two documents : one with the questions for the students, and one with the solutions for my poor head…

Some of you may have noticed that buying a new laptop meant that I either accepted a set of software licenses, or refused them and asked for a refund.

In fact, I asked for a refund even before I bought. And since that wasn’t enough to make them budge, I made it clear to the selling company that I really meant it, with a nice official (paper) letter afterwards, giving them 15 days to comply.
A nice mail thread followed, during which they could propose either 20€ and nothing on the bill, or going through ASUS for 25€, but only if I sent them the laptop (paying shipping both ways!) and had the warranty voided. For some reason, I met neither option with enthusiasm.

When the 15 days were up, I sent them a mail to tell them that this probably meant legal action, but I still wrote to an official organization (the DGCCRF) as a last chance for them.
They were suddenly able to make the following offer : 20€, with a line on the bill. Which leaves me wondering whether they’re selling at a loss, since it’s 25€ through ASUS. But at least, they finally acknowledged they had to detail their bills.

I don’t know what I am going to do now, but things seem to take a better shape somehow.

I spent hours trying to figure out what my students were trying to prove — I knew the questions, but the answers made little sense.

The problem is that they’re focused on doing computations, and for that they use letters, letters, letters… none of which they introduce before use!

Worse, sometimes they will fix, say, $x>0$, then later on consider a function $x\mapsto f(x)$… which means a collision!

Even worse, since they’re just toying with letters, they have no issue applying a function defined on real numbers to a complex number or a high-dimensional matrix…

So my current job is to beat some sense into them and have them define things before use, and define as little as possible. This way, when they take the real, serious test at the end of the year, they won’t get ridiculously low marks just because their proofs don’t look like proofs.

If only they had more computer science to do with me, I could try to put them in front of languages which won’t let them use variables without declarations, won’t let them apply operations that don’t make sense, and won’t let them reuse the same variable inside the scope of another!

But I only have them for a little maple work (sigh… unfortunately, xcas or maxima aren’t an option for them), and they have a very hard time just writing a simple function to loop over an array, so I can hardly get a compiler to suffer instead of me : I’ll have to do all the complaining myself.

Since my laptop needed replacement, I had to study what was available on the market… It took a while to ponder what was good and what wasn’t, and to try to match what I wanted with what I could afford.

I finally settled on a Dell Inspiron 1520, with some options. But when I placed my order, on the 24th of august, I was rather displeased to discover that the estimated date of delivery was the 5th… of october!

I waited a little, asking why it took them so long. The answer was : “Worldwide screen shortage”. Ouch! Of course, a little poking around made the truth more apparent : Dell suffered from a shortage on a global scale, but there was no real global shortage : the rest of the world was doing fine. Two weeks gone, and the very very interesting configuration was already only very interesting… and still three weeks away : I cancelled.

The hunt began anew, and this time I settled on an ASUS F3SV-AK143C. It took less than a week to get it, and even that long mostly because the delivery did a little tourism (for those who know how to use a map : it was shipped from Écully near Lyon, bound for Crolles near Grenoble, and went there through Villeneuve-la-Garenne near Paris!).

Of course, I immediately tried to install a sane operating system on it : debian unstable. And since I had already set up one of my boxes (goedel) as a PXE server (see my post about it), I thought I could use that to kickstart the installation.

Failure! No DHCP answer. In such a case, there’s no need to think long : first I made sure the cable was correctly plugged in on both ends, then checked the dhcpd configuration file. The cable was well plugged in, but dhcpd had the mac address of my (now) old laptop (hilbert) : a slight modification, a dhcpd restart, and I could reboot the new laptop (noether) to see if things went better.

Success! I was now greeted by the new debian-installer splash, and could choose my language and keyboard setup. In one hour I would have everything set up!

“Sorry, your only network interface seems to be firewire. Do you want me to hurt you?”. Well, it wasn’t precisely worded like that, but I can swear that’s what it meant! Let’s forget the one-hour setup… sigh.

Of course the issue is that it’s a very recent computer, and the kernel is linux 2.6.18. Bad. I tried to stick a more recent kernel in the initrd provided by goedel, and rebooted noether with it.

Worse. Now it seemed it didn’t recognize the hard disk correctly (something about waiting for a device to show up).

Ok, so I’m back to 2.6.18, which just lacks network. Hmmm… the card, after some searching, is an Attansic L1, supported from 2.6.21 onward. After some chasing, I found an atl1.ko for this kernel (already compiled — the driver is GPL and the sources are readily available, but unfortunately, I don’t have a complete compilation suite with kernel headers on a minimal installation!).

Now the game is to find out how to add it to initrd.gz. It turns out unpacking it is pretty easy with cpio, but careless repacking won’t give good results ; I found the magical command line on the web : find . | cpio -o --format='newc' | gzip -9 > initrd.gz

Better! After the installer complained about the network card, I went to the next console and, after a modprobe, asked it to reconsider its failure : this time it worked.

The base install went smoothly ; after that, it was time to restart on the hard disk. But before rebooting (as I’m a little paranoid), I had the idea to copy my atl1.ko onto the new system. I wanted to be sure I would have a working network card, and since atl1.ko wasn’t there already, I wouldn’t have had one! I was so happy to avoid that issue!

Restart, no network, but no problem : a modprobe and I’d be set. Modprobe complained that the running kernel and the module didn’t have the same size for a struct : binary incompatibility! The module was compiled for 2.6.18-4-486, while the running kernel was 2.6.18-5-686. Rrraaahhh…

So I finally gave up and decided to use an installation CD — but one with an up-to-date kernel (see here). It took a while to download and burn (normal), then a huge while to start to boot (strange!), then to load the kernel & ramdisk (strange!). It took another huge while to check what was available on the cdrom (strange!), and finally failed with an error (definitely bad!). The dmesg output was accusing me of having put the CD in upside down, among various other dirty words : uh, is this drive toast? Since the BIOS seemed to have a hard time booting from it in the first place, that seems to rule out a driver bug in favor of a hardware problem…

Ok, at that point, since I don’t have a USB key (something I’ll probably change after two terrible afternoons trying to get laptops running), and since I can’t trust the DVD drive… I have no choice but to make the network work.

I cracked open the initrd that goedel was giving noether on bootup, added the initrd itself, its kernel and the corresponding modules to the directory, repacked, and rebooted noether with it : this way, I was able to put a known working kernel+initrd+modules on the hard disk. I just had to reboot on the hard disk (so no network), put things in place, ask lilo to use that, and I was saved.

Upgrading to unstable was easy, with the still-pending problem of grub : it doesn’t want to install itself (but is polite enough to fail without breaking the lilo installation!).

I completed the install by pulling can’t-live-without packages (wesnoth, nethack, wormux, freedroids RPG, widelands, etc), can’t-work-without packages (texmacs, pari-gp, maxima, gnumeric, etc), can’t-use-without packages (gcc, ekiga, xchat, gajim, iceweasel, icedove, etc) and
don’t-know-how-to-categorize packages (xorg, kino, gimp, nemiver, gnome&xfce, vlc, etc).

Since I had X running (nv driver) and vlc, I tested my DVD drive with a few things (films of course, but also the DVD I always use for work — the only one I really care about) : all worked. So it confirms it’s indeed a hardware issue.

I’ll install & use the nvidia driver when the xserver-xorg and nvidia-glx packages are concurrently installable : there was an ABI change in Xorg, and of course the proprietary code couldn’t be updated. This will allow me to gather information for the nouveau project, and I’ll see how I can help them get full support for the card — it may take long, though.

g++ 4.2 : nice!

September 13, 2007

Eh, I was hunting bugs with my favorite unnamed editor today, when I was savagely attacked by a big pile of warnings I hadn’t seen before. Apparently, the g++ I have (4.2.1) doesn’t like it when one passes constant strings to functions which declare they need a char* — a const gchar * wouldn’t trigger the warning. And this is so right!

So here I am, fixing ekiga’s internal apis here and there, basically adding ‘const’ keywords which should really have been there already.

Our code will be that much cleaner 🙂