Wayland ♡ drawing tablets

So this is finally happening. The result of much work all through the stack from several dedicated individuals (you know who you are!) has been lining up during the past few months and is now hitting master. Early in the cycle I blogged about stylus support being merged, based on the first version of the tablet protocols. Now I have the pleasure of declaring GTK+ tablet support on Wayland feature complete.

As predicted in the earlier post, a second version of the protocol came through, bringing pad support for clients. What is a pad? It’s the set of buttons and other (often) tactile sensors that tablets have around the stylus-sensitive area. These devices are rather uncanny from an input management perspective: unlike mice/keyboards, those buttons/sensors don’t have an associated action that has been chiseled on the key or settled through decades of user interaction paradigms; they are rather meant to be user-mappable. Also, despite buttons being buttons and sensors being essentially one-dimensional axes, any resemblance to a mouse/stylus is purely coincidental: focus management is more similar to keyboards (actually, the same), and is not directly related to the stylus.

In GNOME, this action-mapping has traditionally been done in gnome-settings-daemon. X11 clients have usually been completely unaware of pad events, partly because of the oddities pointed out above. So g-s-d kept a passive grab on pad buttons and would translate those into keycombos. This has many shortcomings though: keycombos are far from standard, accelerators are translatable, … For Wayland, we have the opportunity to fix all these shortcomings and make pads first class citizens.

So the right thing to do here, if we want to unambiguously map actions to pad features, is delegating the action mapping to the client. The other side of the coin is providing proper feedback about the actions. In GNOME we have session-wide OSDs to display the pad mapping, and the Wayland protocol has been tailored to cover a use case like this, so applications can directly participate in filling in this info. This is how the end result looks:

[Screenshot of the pad OSD]

* Note to supervillains: Actions are stubs

There are also changes scheduled (I know I’m late, but who doesn’t like pressure!) for gnome-control-center to centralize the management of styli for all known/plugged tablets, following the very nice mockups from Jimmac.

How is this exposed in GTK+?

gtk+ master introduced GtkPadController, a GtkEventController subclass that will manage pad events, from one pad specifically or for all at once. The controller takes a GActionGroup and a set of pad action entries, each defining an action name to activate. A simple example:

controller = gtk_pad_controller_new (GTK_WINDOW (app_window),
                                     G_ACTION_GROUP (app_window),
                                     pad_device /* May be NULL for all */);
gtk_pad_controller_set_action (controller,
                               GTK_PAD_ACTION_BUTTON,
                               1, /* Second button, they're 0-indexed */
                               -1, /* All modes */
                               _("Frobnicate file"),
                               "frobnicate-file" /* Action in the group above */);

Pressing that button on that pad will then trigger the related action from the GActionGroup. The given label is what you end up seeing in the OSD. I expect this object to eventually be made useful on other platforms than Wayland (X11 is the first objective), although only for the event-to-action mapping, and not entirely exempt from platform-specific issues.
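For completeness, here is a minimal sketch of the other half: the action is a plain GAction in the group handed to the controller. The "frobnicate-file" name and callback are just the hypothetical example from above, and GtkApplicationWindow implements GActionMap, so the action can be added straight to app_window:

static void
frobnicate_file (GSimpleAction *action,
                 GVariant      *parameter,
                 gpointer       user_data)
{
  g_print ("Frobnicating!\n");
}

/* Add the action to the same group the controller was created with */
GSimpleAction *action = g_simple_action_new ("frobnicate-file", NULL);
g_signal_connect (action, "activate", G_CALLBACK (frobnicate_file), NULL);
g_action_map_add_action (G_ACTION_MAP (app_window), G_ACTION (action));
g_object_unref (action);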

This is going to be messy! Why does every app need configuring? Why not a global action map?

The Wayland protocol aims to be timeless; the trouble would only begin with defining a good enough initial set of global actions, and still not all actions might make sense for every app. Delegating the mapping allows every application to implement pad actions, no matter how high or low level they are. Configurability is also optional: although the first apps that will surely come to your head will implement some, they already did for styli and other input features.

Is this for me?

That depends :). If you see your app being involved in artists’ or designers’ workflows, it well could be. Providing a sensible minimal set of actions to perform, even if hardcoded (think of those you’d like shortcuts for when away from the keyboard), could help lots in integrating this previously neglected device, to the point that it might feel widely useful beyond the niche drawing application case. By using GAction underneath, there’s also little left for you to test; you just need to ensure the action performs the intended effect when activated.

GTK+ Wayland tablet support is merged

As far as excuses go for blowing the lint off my blog, this is a pretty good one :). Wayland tablet support is something that got really close to being merged in 3.20, but the timing didn’t pan out in the end; wayland-protocols 1.3 included a tablet v1 protocol that went unused, till now. Now that we’re in 3.22, those bits are at last being merged. First came gtk+, which you can see working in these videos:

This improved tablet support brings some goodies (also on X11): there is a new GdkDeviceTool object that can be used to track specific physical tools, and it provides the necessary info to do so in a persistent way (eg. across runs/reboots).
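For instance, here is a minimal sketch of identifying the tool from an event and keying persistent settings off its serial; the settings lookup itself is left as a hypothetical placeholder:

static void
configure_for_tool (GdkEvent *event)
{
  GdkDeviceTool *tool = gdk_event_get_device_tool (event);

  if (tool != NULL)
    {
      /* The serial identifies this physical tool persistently,
         eg. across runs/reboots */
      guint64 serial = gdk_device_tool_get_serial (tool);

      g_print ("Tool serial: %" G_GUINT64_FORMAT "\n", serial);
      /* ...look up per-tool settings (brush size, mapping, …) here... */
    }
}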

A note to all GTK+ app/widget developers

TL;DR: If you’re keeping track of motion events somewhere in your app, you want to check gdk_event_get_device(); see the bullet points below and pick the best strategy for you.

One particularity of Wayland tablet management is that whenever you have a tool in proximity to a tablet, you’ll be driving a standalone onscreen pointer cursor, as opposed to the default behavior in X11 of driving the same pointer as your mouse. This difference is not moot: it doesn’t take much thinking to realize that the user of a drawing tablet has a spare hand which could be using the mouse, so there are, in practical terms, two pointers to react to. And there are also setups where more than one drawing tablet is likely (the simplest one being a laptop with an integrated tablet plus an external one).

GTK+ itself is quite well prepared for these situations, as are all its widgets; it can’t account for widget implementations elsewhere, nor for signal callbacks in apps, though.

But I’m not doing a drawing app, should I care?

Yes, you should. Although tablets are mostly thought out for drawing, design, etc., ideally they should be able to drive the whole desktop without oddities, just as your mouse does. This can only lead to a pleasant experience with your app, regardless of the input devices of choice.

If there’s somewhere in your app where you’re tracking pointer events over time (drawing is obviously an example, but so is rubberband selection, any kind of drag-to-move, etc…), there’s a chance that it could get confused by events with different gdk_event_get_device()s coming to the widget in question. There are several things you might do:

  • First and foremost, look into GtkGesture and its several subclasses; it’s all handled transparently there, and there’s a good chance that one of those (or a group) will fit your case.
  • Maybe a “first come, first served” basis is sufficient for the specific purposes of your operation? You can use gtk_device_grab_add/remove with that device, or check gdk_event_get_device() yourself on all events.
  • Think it’d be cool to handle the simultaneous pointers and want to go the extra mile? Again, track gdk_event_get_device() and keep per-device state, as in the sketch below.
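A minimal sketch of that last approach; the DragState contents are hypothetical, track whatever your operation needs:

typedef struct {
  gdouble last_x, last_y;
} DragState;

static GHashTable *state_by_device = NULL; /* GdkDevice* -> DragState* */

static gboolean
motion_cb (GtkWidget *widget, GdkEventMotion *event, gpointer user_data)
{
  GdkDevice *device = gdk_event_get_device ((GdkEvent *) event);
  DragState *state;

  if (!state_by_device)
    state_by_device = g_hash_table_new_full (NULL, NULL, NULL, g_free);

  state = g_hash_table_lookup (state_by_device, device);
  if (!state)
    {
      state = g_new0 (DragState, 1);
      g_hash_table_insert (state_by_device, device, state);
    }

  /* Update only this device's state; the other pointer keeps its own */
  state->last_x = event->x;
  state->last_y = event->y;

  return GDK_EVENT_PROPAGATE;
}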

…And there’s more to come

There are big plans for mutter in this cycle, so tablet support will not be immediately merged, but rest assured it will happen in due time. Gears are also grinding in order to have a version 2 of the tablet protocol; it will mostly focus on bridging pads (the sets of buttons and sliders along the sides of some tablets) to Wayland clients, something which I expect to be an outstanding improvement compared to X11, so… stay tuned!

I shot the Tracker

In free software some fashions never change, and some are particularly hard to overcome. Today I’ll talk about the “Tracker makes $ANYTHING slow” adage, with gnome-music lately being in the spotlight here. I’m glad that I could personally clear this up for some individuals at the hackfests/conferences I’ve been around lately.

But convincing is a never-ending labor; there are still confused people around the internets, and disdainful looks don’t work as well over there. The next best thing I could do is fix things myself to make Tracker look less like the bad guy. So, from the “can’t someone else do it” department, here are some commits to improve the situation. The astute reader might notice that there is nothing about Tracker in these changes.

There’s of course more to it; AFAICT the other minor performance hits are caused by:

  • grilo emitting one signal per media item found, which is somewhat bad on huge lists
  • icon view performance generally sucking, which makes scrolling not as smooth in the Albums view while covers are loading
  • After all that, well sure, Tracker queries can be marginally optimized.

This will eventually hit master and packages. Until then, do me a favor and point anyone still saying that Tracker made gnome-music slow to this post.

Developer experience hackfest

Kind of on topic with this, a few weeks ago I attended the Developer Experience hackfest. Besides trying to peg round pieces into square holes, after some talk about how steep a barrier SPARQL is as a prerequisite for accessing Tracker data, I started on a simpler query API there that abstracts away all of these gritty details. The code is just shaping up, but I expect it to cover the most common use cases. I must thank Red Hat and Collabora for enabling me to go, all the people there, and particularly Philip for being such a great host.

Oh, and I also attended FOSDEM and DevConf, and even spoke at the latter about the input plans going on in GNOME. Busy days!

GNOME 3.14 approaching

With 3.14 almost out of the door, it seems like a good opportunity to blow the cobwebs off this blog and highlight some shiny new features I was involved in during this cycle:

Gesture support in GTK+

It’s finally happening :). GTK+ 3.14 brings in infrastructure to handle gestures. Maybe the word “gesture” is automatically associated with “multitouch”, but this infrastructure is actually meant to deal with all kinds of pointer/touch input, and as such is used fairly intensively now within GTK+ itself, so even mouse users will be unknowingly using it.

These gesture objects are of course readily available to applications too. Individually they are quite simple in essence, but they can easily be stitched together to compound higher-level behavior. So far, eog and evince (and by extension gnome-documents) have bitten the bullet and now handle some of the gestures you’d expect on touchscreens there; the API documentation and HowDoI are available for anyone wanting to follow.
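As a taster, here is a minimal sketch of attaching one of these gestures to a widget; the widget and callback names are illustrative:

static void
zoom_cb (GtkGestureZoom *gesture, gdouble scale, gpointer user_data)
{
  g_print ("zoom scale: %f\n", scale);
}

static void
attach_zoom (GtkWidget *drawing_area)
{
  /* The gesture tracks touch sequences itself and emits high-level
     signals; the caller owns the returned object, so tie its
     lifetime to the widget */
  GtkGesture *zoom = gtk_gesture_zoom_new (drawing_area);

  g_signal_connect (zoom, "scale-changed", G_CALLBACK (zoom_cb), NULL);
  g_object_set_data_full (G_OBJECT (drawing_area), "zoom-gesture",
                          zoom, g_object_unref);
}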

Gesture support in gnome-shell

Just to feed the cabal claiming that gnome-shell is designed for tablets, this cycle I also managed to add gesture infrastructure to mutter, so it is able to pre-process touch events before applications do. These events are then handled through ClutterGestureActions, or “rejected” and eventually handled by the application. This has been put to use in gnome-shell, making some actions readily available through multitouch.

Videos (captions of the embedded recordings):

  • Edge swipes
  • Showing the overview
  • Switching workspaces
  • Switching apps in a workspace

Note: Recorded with the help of this, bug #732367 is yet another 3.16 todo item…

Freebase support in libgdata

This one feature has certainly gone underpublicized, and I found myself with little time to make use of it :(, but I nonetheless find that very interesting things can be done with it. Freebase is a community-maintained knowledge base (currently backed by Google); as seen on its homepage, it is extremely wide in topics (some better covered than others) and has a very well defined ontology. Think of it as a companion to Tracker on the web.

There are dedicated methods for the most usual ways to query data (search, lookup on topic…), but additionally Freebase offers a powerful MQL query method. MQL is very analogous to SPARQL, with the main difference that it’s based on JSON (a sketch of one such query follows the list below). Altogether, this allows for querying very varied data in very creative ways, a few examples being:

  • The mandatory “query for movie/album info” example, actually these topics are the best covered.
  • Fetching stock images for movies/cities/landmarks/directors/…, you name it.
  • Looking up monuments close to a geolocation.
  • Getting direct links to Wikipedia, in your language.
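To give a flavor of MQL, here is an illustrative query: a JSON object where null values mark the data you want filled in, in this case a movie’s director and release date. Treat the exact property names as an example from the Freebase film schema rather than a reference:

[{
  "type": "/film/film",
  "name": "Blade Runner",
  "directed_by": null,
  "initial_release_date": null
}]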

Looking forward to 3.15

It is almost time to celebrate, but I evidently won’t sit twiddling my thumbs :). A few items I’d like to tackle in the next cycle are:

  • During the 3.14 cycle I got briefly started on adding optional gesture integration to GtkStack and a new “tabs” widget; now sounds like a good time to resume. I would also like to make gestures integral to event handling in GTK+ (we’re already just a few widgets away from that goal).
  • There are a few gaps still left in handling touchpad gestures, which I’d like to get closed ASAP, at least for touchpads handling >2 fingers on X11.
  • Improving GNOME on Wayland. I merely got started this cycle, adding DnD/clipboard support to GTK+ and bringing touchscreen behavior on mutter more or less on par with X11’s. There are a few input details that need shuffling so they’re done in the same place on X11/Wayland (pointer cursor visibility, device mapping…), and I hope the timing is right to bring in a sort of tablet support (libinput and Wayland protocol details have been shaping up despite my sometimes-on, mostly-off help; thanks Peter, Lyude, Jason et al!). I will be putting my hacking efforts wherever necessary to make this happen.
  • WebKitGTK+ could definitely be made friendlier on touchscreens; in addition to the DOM touch event support it already has, it would be great to handle touch scroll/pinch/zoom as you can see in other pure GTK+ apps now.

A late devX hackfest report, gestures

Last week the Developer Experience hackfest took place in Berlin. Quite some blogging has happened during and since; nonetheless, this was a pretty good excuse to blow the cobwebs off my blog and chime in too :).

The event was quite intense, and plenty of discussion happened around the GNOME platform, developer tools, API documentation generation, and introspection. My main point of interest there was GTK+; it was great to see a roadmap shape up, and a consensus form about the gestures branch (more about that below). I also offered to look into doing something that could replace GtkNotebook in the future, which I started hacking on during the hackfest. I also had a nice opportunity to chat with Philip about Freebase support in libgdata, and to solve a few questions around Tracker.

Other than that, nice chatting, and a very fine choice of places for food and beverages. Many thanks to Chris Kühl and the Endocode guys for making this so enjoyable!


One point on the GTK+ meeting agenda was gesture support. The gestures branch (which has had a somewhat intermittent history during the last ~2.5 years) has lately shaped up into something that was agreed mergeable for 3.14. This branch introduces “gesture” objects that abstract away all the intricacies of (multi)touch management. These objects handle events and track individual touches; when a gesture enters a “recognized” state, high-level signals are emitted. This has been made to work in parallel with regular event delivery, in order to allow for different degrees of interoperation with handlers there, and making single-touch gestures work with pointer events is just a call away.

There is of course a need for cooperation between gestures, both within a widget and across the widget “stack” that’s receiving events for each individual touch; each of those will be triggering a number of gesture objects as the touchpoint moves. To cater for that, gestures have a none/claimed/denied state on each tracked touch, and changes in those are communicated across the stack (possibly causing cancellation on child widgets also handling the touchpoint, or cancelling other same-level gestures that previously claimed the sequence).
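For example, a minimal sketch of a gesture claiming a touch sequence once it knows it wants it; the callback name is illustrative:

static void
drag_begin_cb (GtkGestureDrag *gesture, gdouble start_x, gdouble start_y,
               gpointer user_data)
{
  /* Claim the sequence; gestures elsewhere in the stack that were
     tracking it get cancelled */
  gtk_gesture_set_state (GTK_GESTURE (gesture),
                         GTK_EVENT_SEQUENCE_CLAIMED);
}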

Intra-widget, some simultaneity may be wanted when handling touchpoints (eg. long press and nth-click handling); gestures may be grouped so they share the same state and can act upon the same events, and a widget can have multiple mutually-exclusive gesture groups (eg. scrolling and “switch page” panning on a scrolledwindow), as sketched below.
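A minimal sketch of grouping; the widget name is illustrative, and a real program would keep references to the gestures:

static void
setup_gestures (GtkWidget *widget)
{
  GtkGesture *press = gtk_gesture_multi_press_new (widget);
  GtkGesture *longpress = gtk_gesture_long_press_new (widget);

  /* Grouped gestures share sequence state, so both may act upon
     the same touchpoint */
  gtk_gesture_group (press, longpress);
}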

So that is it, in a very tiny nutshell! The API added in the branch is documented, and that will be a more accurate source :). Now, a video showcasing this, unedited because Pitivi didn’t get along with me today. A 42-second journey from well-known lands to the barren plains of insanity, made manageable thanks to this branch:

The branches for eog and evince are available too, for anyone willing to play with this soon. But as said, hopefully everything will soon be in master!

Old times, new times

The facts narrated here happened in early November, but I’m apparently blog-constipated, hence the extra month…

So, after a total of 6.5 years working at Lanedo (and formerly Imendio), my last day there was October 31st. After all that time, I consider it a privilege to have worked with them; they’re a smart bunch and I’m very sure they’ll be doing fine ;)

After that? Shiny new times, this happened:


On November 1st I joined Red Hat! And so far it’s been an incredible experience. I’ve spent much of my time there tinkering, surrounded by tablets, touchscreens and other input devices, and will be doing so for the foreseeable future, so expect bursts of activity around the input area :).

Introducing Mechane, GUADEC

Around early 2012, I made some rough experiments around input and event delivery that required nothing but GLib, X and cairo. It was only a bit later, when I had a basic event delivery and rendering model, that I started thinking about how windowing subsystems in toolkits would have evolved if we’d had all the tools and lessons that we have now.

Fast forward some months of on-and-off development, and I think this has turned into something worth sharing. All say hello to Mechane. In that time, ideas flowed, many were ditched, Wayland was fully embraced, and this experiment is eventually turning into a lean codebase:


What is Mechane?

Mechane is a concise wrapper to the windowing system, with all expected modern features integrated in the core. There are several widgets, ranging from simple to complex, exercising these features. Even though it internally has a backend abstraction, there is only a Wayland implementation.

Some gory details

The basic building block in Mechane is a MechArea. As the codebase was principally built to check input-related concepts, it was made so elements could be reused most readily, so a micro-widget approach was taken, similar to Rapicorn[1].

Under this approach, simple MechAreas do as little as possible, and those work as building blocks to create increasingly complex composited MechAreas. There is an emphasis on using GInterfaces for the most usual data exchange with a UI, as composite areas have facilities to delegate full interfaces to child areas. This all allows for making composited areas that are fairly opaque on the outside, while the little inner pieces focus on their business.

Transformations are also dealt with as a core feature: any MechArea can be assigned a transformation matrix. The coordinate space in events is fully virtualized (starting at 0,0 on the area receiving the event), though there are ways to translate between coordinate spaces whenever needed. One might think transformations are mainly necessary for effects; scrolling is the most notable example of transformations in Mechane, though.

In this design, the basic MechArea that does text display/edit will be terribly ubiquitous if used as the foundation for entries, labels, text views and whatnot. As such, quite some emphasis was placed on having something that scales very well, and the result has gotten really nice: MechTextView can handle massive amounts of text, letting you instantaneously scroll anywhere in the buffer [2]; no CPU grinding happens in the background in any situation.

And of course, Wayland offers some possibilities that weren’t as readily available in X11; Mechane so far handles SHM and EGL surfaces, even though this is all quite transparent thanks to cairo (unless of course you want to mess around with that).

With great power, comes pure crack

But this all started around input! There’s a collection of gesture controllers and incipient experiments around making those cooperate across a hierarchy; there are, though, interesting puzzles around grab transfers and cancellation that are still subject to investigation.

This all makes for a very nice result: every piece is very self-contained, in a way that many things can be reused or made to work elsewhere; eg. a “complex” widget like a scrollable view came out like a piece of cake. As for rendering performance, going through software rendering it comfortably beats the 60fps mark in most situations on now-oldish laptops.

In such early stages, there are of course quite some open fronts: first and foremost is input handling; some Wayland features are still unsupported, like subsurfaces and clipboards; and the techniques used for the validation-less textview are worth investigating for icon/list views.

Of course

Kudos to Kristian Høgsberg and all wayland/weston/cairo folks. I’m happy with how lean Mechane is turning out to be, but it could only be so lean thanks to their work.

Going to GUADEC!

For the 11th consecutive time, I’m attending GUADEC! Thanks to Lanedo I’ll make it to Brno in 1.5 days. Aleksander and I will be talking about Tracker. So if you want to talk about Tracker, Mechane, GTK+ or input handling, I’ll be around, see you there!

[1] Yes, this is the second toolkit-thingy from your fellow lanedians, there must be some irony in it.
[2] Well, somewhere between 500 and 700MB of text it goes from “snappy” to “responsive” here due to other arising bottlenecks; still more text than I can read without my eyelids falling off.

Snippets in Tracker’s full text search results

After quite some time without touching Tracker code, last week I finally could get back to a branch that’s been sitting there for some time now. In the fts4 branch, SQLite requirements have been updated to >=3.7.9, so we can stop compiling our custom FTS module and start using what comes with libsqlite.

What does this mean? Internally, there’s less (and no longer stale) code on our plate, and external content support in FTS tables means no performance or database size penalties. Plus, we get all the recent hot improvements in sqlite releases for free.

A bit more on the user-visible side, a feature that became possible with this switch is support for fts:snippet(), which you can use in SPARQL queries to get snippets of the matched text:

$ tracker-sparql -q "select nie:url(?file) fts:snippet(?file) fts:rank(?file) where { ?file a nfo:Document ; fts:match 'reference' } order by desc (fts:rank(?file)) limit 3"
file:///home/carlos/Documents/Papers/pdf_reference_1-7.pdf, ...Reference Streams G8.1872911 Cross-Reference..., 46.0
file:///home/carlos/Downloads/addison.pdf, ...GLU are described in the OpenGL Reference Manual. The more useful GLU..., 40.0
file:///home/carlos/Downloads/ThesisHo.pdf, ...A8 ]+ ) is also included for reference. In the third experiment, we apply..., 40.0

So it’s easier on the eye, the tracker-needle search tool now also shows snippets where available, providing nice context for the matched content:


Remember that FTS searches apply to any property that’s specified in the ontology as tracker:fulltextIndexed; you can run this to find out which:

tracker-sparql -q "select ?prop rdfs:label(?prop) tracker:weight(?prop) where { ?prop tracker:fulltextIndexed true }"

There’s also the possibility you had no idea what I’m talking about :). If desktop semantic search still tickles your curiosity, I’ll point you to the fine documentation gathered about Tracker.

This work was kindly sponsored by Lanedo.

Introducing Cossa, a GTK+ theme previewer for gedit

Earlier today I pushed gedit-cossa, a plugin for gedit to help with writing CSS for GTK+ themes. It is able to display a number of samples, loaded from GtkBuilder files.

Here’s a video demonstrating how it works:

Cossa is still in pretty early development stages; immediate plans include:

  • Hooking CSS parsing errors to the gedit view
  • Adding a lot more samples, these should range from simple examples (basic widgets in different states) to complex (basic main window sample, preferences dialogs, …)

Anyone is welcome to help, especially on the second point, as it is fairly straightforward to add new samples.