Multifoobar

Over the last few weeks, I've given myself the opportunity to tinker with XInput 2.1 in GTK+, now that the core concepts of the involved changes seem settled (although corner cases are still being discussed).

Thanks to the previous port to XInput 2.0, handling multitouch events has been fairly easy: by enabling GDK_TOUCH_MASK on your widget, you can receive GDK_TOUCH_PRESS/MOTION/RELEASE events containing a touch ID, which is unique for the lifetime of that touch event stream. With that enabled, you can effectively receive events from simultaneous touches.

In addition, one can create GdkTouchClusters (tied to a GdkWindow) and add touch IDs to them. From that point on, updates to those touch IDs stop generating individual GDK_TOUCH_MOTION events; instead, GdkEventMultiTouch events are sent, containing the latest event for every touch ID in the GdkTouchCluster.

I’m pretty happy with the resulting API, although there are some internal details left to improve. My main gripe currently is that implicit grabs in CSW (client-side windows) still happen per device, which means no multi-widget multitouch within an app yet (this also poses interesting problems with keyboard event redirection and explicit grabs). Another big item is integrating this with gestures, as direct manipulation and gesturing need to go hand in hand, so that branch could be discontinued in favor of this one.

As I’m pretty lazy, and the results would be strikingly similar to what I previously posted, I won’t post any video, so you’ll have to settle for a screenshot:

This is sitting now in the xi21 branch. If you just compile this, you won’t see much working, though: the demo only reacts to touch events, which are only sent if XInput 2.1 is present, which in turn requires compiling certain Xorg module branches and having a multitouch device at hand.

8 thoughts on “Multifoobar”

  1. @zekopeko: They lie side by side. AFAIK uTouch uses XInput 2.1 in Natty nowadays, so conceivably both could interoperate in the same environment.

  2. So if I’m understanding this correctly, X receives the input, GTK+ sees what widget was touched, and uTouch determines what the gesture was, so it can tell the app what the user wants to do?

  3. @zekopeko: Roughly, yes. X can deliver touch events to several clients, with touch ownership notifications and all, so a touch stream could be interpreted/handled/discarded simultaneously at different levels (app, wm, desktop-wide gesture daemons, …). This is one of the tricky parts of the input spec changes, though.
