Multitouch is near…

So, after a few attempts over the last year, the multitouch Xorg patches were posted and merged to master last month, making multitouch available in the upcoming Xorg release. This turns the multitouch GTK+ branch into a suitable candidate for GTK+ 3.4, which obviously deserves a video demoing what’s up there:



Hopefully soon in master, very soon…

15 Responses to “Multitouch is near…”

  1. luc says:

    one more time, WOW!

  2. […] the GUI libraries are also starting to implement it, first and foremost GNOME’s GTK+. As Carlos Garnacho demonstrates in his blog, the multitouch GTK+ branch is already fully […]

  3. Wout says:

    Can’t wait, looks good. What’s the relationship with projects like utouch and Qt? Are gestures generic or toolkit-specific?

  4. Alex says:

    That’s… friggin’ awesome! Can’t wait! Will that work with an Apple Magic Trackpad, for example?

  5. carlosg says:

    @Wout: Both projects would be dipping their toes in the same waters, namely XInput 2.2. Gestures are toolkit-specific, but extensible via API.

    @Alex: Yeah, it should work too, though I didn’t have one to test with.

  6. Jeremiah says:

    Rock and roll! So excited to see this working.

  7. […] Carlos Garnacho tells us in his blog that the multitouch patches for Xorg are now ready, and that it would not be far-fetched at all to think of a GNOME 3.4 that already includes support for this new feature. […]


  8. Are those video encoding artefacts, or is my mplayer broken?

  9. Mohamed Ikbel Boulabiar says:

    Carlos,
    at the end you showed a gesture recognition app.
    Any hints on the algorithms used there?
    And why do you recognize them only at the end?

  10. carlosg says:

    @Andres: From what I can tell, it should work, driver problems aside :)

    @Marius: the video is downscaled; it was that or a much larger file…

    @Mohamed: see http://git.gnome.org/browse/gtk+/tree/gtk/gtkgesturesinterpreter.c?h=multitouch#n1332 , the gestures interpreter only acts when the last touch is lifted from the screen. Although this implementation could also allow for “tentative” gestures, for the typical use cases (is it a zoom? is it a rotation?) it is more reliable to use GtkWidget::multitouch-event and the utility functions than doing a statistical analysis of the events (a rough sketch of that pattern follows after the comments).

  11. DexterJ says:

    Hey there,
    Neat post. Does anyone have a link to a night-vision camera that can see things in night vision even when there’s a source of light near it, pointing in the same direction?

    I’m doing this to build a multitouch console, but I’d really like to skip the infrared part for right now, though I know I’ll have to use a FireWire camera and infrared lights later on.

    But the webcam I need now needs to be USB and under $300 USD.
    Nice one!

  12. skierpage says:

    That’s really great toolkit work! I love the spot feedback and gesture feedback.

    Are gestures orthogonal to multitouch, since you can also make them with a mouse or trackpad? In the C file you reference I see initialize_gestures_ht() sets up swipe and circular gestures, but at 1:10 the video recognizes ‘M’ and ‘I I’. I assume you haven’t got a full handwriting recognition engine (?!!), but will GTK offer predefined Roman letters?

  13. carlosg says:

    @DexterJ: I remember there were several tutorials on YouTube about DIY touch surfaces built with boxes and cheap webcams; those could be made to work with TUIO, which even has an adapter driver for Xorg, but I haven’t done that myself.

    @skierpage: Yes, gesture interpretation is orthogonal to the device, so you can surely have gesture recognition on mouse input, although the GtkWidget-level convenience API only does it for touch devices.

    About the ‘M’ and ‘II’ gestures, those are added programmatically by the test application, in addition to the stock ones (see http://git.gnome.org/browse/gtk+/tree/tests/testgestures.c?h=multitouch#n310 ). Using this for handwriting recognition as-is is indeed a bit of a stretch, although it could be used as the base for such a specialized engine. A generic sketch of the stroke-matching idea follows below.
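
As a rough illustration of the GtkWidget::multitouch-event approach mentioned in comment 10: the sketch below only assumes that such a signal exists on the multitouch branch, as that comment states. The handler prototype, the event mask and the rest of the boilerplate are assumptions modelled on regular GtkWidget event signals, not code taken from the branch, and it would of course only build against that branch.

    #include <gtk/gtk.h>

    /* Hypothetical handler: the prototype mirrors other GtkWidget event
     * signals and is an assumption; only the "multitouch-event" signal
     * name comes from comment 10 above. */
    static gboolean
    on_multitouch_event (GtkWidget *widget,
                         GdkEvent  *event,
                         gpointer   user_data)
    {
      /* Per-touch events would be inspected here, e.g. tracking two touch
       * points to derive a zoom factor or rotation angle for the widget. */
      return FALSE; /* let the event propagate */
    }

    int
    main (int argc, char *argv[])
    {
      GtkWidget *window, *area;

      gtk_init (&argc, &argv);

      window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
      area = gtk_drawing_area_new ();
      gtk_container_add (GTK_CONTAINER (window), area);

      /* The exact touch event mask the branch expects is also an assumption. */
      gtk_widget_add_events (area, GDK_TOUCH_MASK);
      g_signal_connect (area, "multitouch-event",
                        G_CALLBACK (on_multitouch_event), NULL);
      g_signal_connect (window, "destroy",
                        G_CALLBACK (gtk_main_quit), NULL);

      gtk_widget_show_all (window);
      gtk_main ();

      return 0;
    }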
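
On Mohamed’s question about the algorithm (comment 9): the gtkgesturesinterpreter.c file linked in comment 10 is the authoritative answer. The snippet below is only a generic template-matching sketch in plain C, showing why a stroke-based recognizer naturally decides once the stroke is complete, i.e. when the last touch is lifted; it is not the algorithm GTK+ uses, and all names in it are made up for illustration.

    #include <math.h>
    #include <stddef.h>

    #define STROKE_POINTS 16  /* arbitrary resampling size for this sketch */

    typedef struct { double x, y; } Point;

    /* Average point-to-point distance between a recorded stroke and a stored
     * template; both are assumed to be resampled to STROKE_POINTS points and
     * normalized in position and scale beforehand. */
    double
    stroke_distance (const Point *stroke, const Point *tmpl)
    {
      double total = 0.0;
      int i;

      for (i = 0; i < STROKE_POINTS; i++)
        total += hypot (stroke[i].x - tmpl[i].x, stroke[i].y - tmpl[i].y);

      return total / STROKE_POINTS;
    }

    /* Returns the index of the closest registered template, or -1 if none is
     * close enough.  Since the whole stroke is needed for the comparison, a
     * recognizer of this kind can only emit its verdict after the last touch
     * has been lifted. */
    int
    match_gesture (const Point  *stroke,
                   const Point (*templates)[STROKE_POINTS],
                   int           n_templates,
                   double        threshold)
    {
      double best_distance = threshold;
      int best = -1;
      int i;

      for (i = 0; i < n_templates; i++)
        {
          double d = stroke_distance (stroke, templates[i]);

          if (d < best_distance)
            {
              best_distance = d;
              best = i;
            }
        }

      return best;
    }

In the interpreter’s case, the stock gestures (swipe and circular, as noted in comment 12) plus whatever the application registers, like the ‘M’ and ‘II’ strokes in the test program, would play the role of the template list, and since the matching only needs recorded coordinates, it works the same whether they come from touches or from a pointer.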