A late devX hackfest report, gestures

Last week the developer experience hackfest took place in Berlin. Quite some blogging has happened during and since, but it was nonetheless a pretty good excuse to blow the cobwebs off my blog and chime in too :).

The event was quite intense; plenty of discussion happened around the GNOME platform, developer tools, API documentation generation, and introspection. My main point of interest there was GTK+, and it was great to see a roadmap shape up, along with a consensus around the gestures branch (more about that below). I also offered to look into doing something that could replace GtkNotebook in the future, which I started hacking on during the hackfest. And I had a nice opportunity to chat with Philip about Freebase support in libgdata, and to clear up a few questions around Tracker.

Other than that, nice chatting, and a very fine choice of places for food and beverages. Many thanks to Chris Kühl and the Endocode guys for making this so enjoyable!

Gestures

One point on the GTK+ meeting agenda was gesture support. The gestures branch (which has had a somewhat intermittent history over the last ~2.5 years) has lately shaped up into something that was agreed mergeable for 3.14. This branch introduces “gesture” objects that abstract away all the intricacies of (multi)touch management. These objects handle events and track individual touches; when a gesture enters the “recognized” state, high-level signals are emitted. This has been made to work in parallel with regular event delivery, in order to allow for different degrees of interoperation with the handlers there, and making single-touch gestures work with pointer events is just a call away.
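
To make that a bit more concrete, here is a minimal sketch of what using one of these objects could look like from the application side. I am going by names like gtk_gesture_swipe_new() and gtk_gesture_single_set_touch_only() as documented in the branch, so take it as illustrative rather than gospel:

    #include <gtk/gtk.h>

    /* Emitted once the swipe gesture is recognized; velocities are in
     * pixels per second. */
    static void
    on_swipe (GtkGestureSwipe *gesture,
              gdouble          velocity_x,
              gdouble          velocity_y,
              gpointer         user_data)
    {
      g_print ("swipe recognized: %.1f, %.1f px/s\n",
               velocity_x, velocity_y);
    }

    int
    main (int argc, char *argv[])
    {
      GtkWidget *window;
      GtkGesture *swipe;

      gtk_init (&argc, &argv);

      window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
      g_signal_connect (window, "destroy",
                        G_CALLBACK (gtk_main_quit), NULL);

      /* Make sure the widget receives the events the gesture needs. */
      gtk_widget_add_events (window,
                             GDK_BUTTON_PRESS_MASK |
                             GDK_BUTTON_RELEASE_MASK |
                             GDK_POINTER_MOTION_MASK |
                             GDK_TOUCH_MASK);

      /* The gesture object handles events and tracks touches itself,
       * and only emits high-level signals. */
      swipe = gtk_gesture_swipe_new (window);

      /* The "just a call away" knob: single-touch gestures can also
       * react to pointer events, not only touch ones. */
      gtk_gesture_single_set_touch_only (GTK_GESTURE_SINGLE (swipe),
                                         FALSE);

      g_signal_connect (swipe, "swipe", G_CALLBACK (on_swipe), NULL);

      gtk_widget_show_all (window);
      gtk_main ();

      return 0;
    }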

There is of course a need for cooperation between gestures, both within a widget and across the widget “stack” that is receiving events for each individual touch; each widget in that stack may trigger a number of gesture objects as the touchpoint moves. To cater for that, gestures have a none/claimed/denied state on each tracked touch, and changes to those states are communicated across the stack (possibly cancelling the touchpoint on child widgets that were also handling it, or cancelling other same-level gestures that had previously claimed the sequence).
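
As an illustration of the claiming mechanism, a sketch of a drag-update handler that claims its sequence once the motion looks unambiguous. Again this assumes the branch names (gtk_gesture_set_state(), GTK_EVENT_SEQUENCE_CLAIMED), and the 16-pixel threshold is an arbitrary value chosen purely for the example:

    /* Connected to GtkGestureDrag::drag-update. The sequence stays in
     * the "none" state until the drag travels far enough; claiming it
     * then cancels the touchpoint on child widgets also handling it
     * and denies competing same-level gestures. */
    static void
    on_drag_update (GtkGestureDrag *gesture,
                    gdouble         offset_x,
                    gdouble         offset_y,
                    gpointer        user_data)
    {
      /* Arbitrary 16 px threshold, purely for illustration. */
      if (ABS (offset_x) > 16 || ABS (offset_y) > 16)
        gtk_gesture_set_state (GTK_GESTURE (gesture),
                               GTK_EVENT_SEQUENCE_CLAIMED);
    }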

Intra-widget, some simultaneity may be wanted when handling touchpoints (eg. long press and nth-click handling). For those cases, gestures may be grouped so that they share the same state and can act upon the same events; a widget can also have multiple mutually exclusive gesture groups (eg. scrolling and “switch page” panning on a scrolledwindow).
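
A quick sketch of grouping, once more assuming the branch API (gtk_gesture_group() plus the multi-press and long-press constructors); the handler names here are made up for the example:

    /* Hypothetical "pressed" handler for the multi-press gesture. */
    static void
    on_pressed (GtkGestureMultiPress *gesture,
                gint n_press, gdouble x, gdouble y, gpointer user_data)
    {
      g_print ("click #%d at %.0f,%.0f\n", n_press, x, y);
    }

    /* Hypothetical "pressed" handler for the long-press gesture. */
    static void
    on_long_pressed (GtkGestureLongPress *gesture,
                     gdouble x, gdouble y, gpointer user_data)
    {
      g_print ("long press at %.0f,%.0f\n", x, y);
    }

    static void
    attach_press_gestures (GtkWidget *widget)
    {
      GtkGesture *multipress, *longpress;

      multipress = gtk_gesture_multi_press_new (widget);
      longpress = gtk_gesture_long_press_new (widget);

      /* Grouped gestures share sequence state and see the same events,
       * so the long press and nth-click can be tracked simultaneously.
       * Ungrouped gestures (or other groups) on the same widget remain
       * mutually exclusive once one of them claims the sequence. */
      gtk_gesture_group (multipress, longpress);

      g_signal_connect (multipress, "pressed",
                        G_CALLBACK (on_pressed), NULL);
      g_signal_connect (longpress, "pressed",
                        G_CALLBACK (on_long_pressed), NULL);
    }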

So that is it, in a very tiny nutshell! The API added in the branch is documented, and that will be a more accurate source :). Now, a video showcasing this, unedited because Pitivi and I didn’t get along today. A 42-second journey from well-known lands to the barren plains of insanity, made manageable thanks to this branch:

The branches for eog and evince are available too, for anyone willing to play with this soon. But as said, hopefully everything will soon be in master!