Over a few releases now, and culminating in Gtk+ 3.10, we have restructured the way drawing works internally. This hasn’t been written about much, and I still see a lot of people unaware of these changes, so I thought I’d do a writeup of the brave new world.
All Gtk+ applications are mainloop driven, which means that most of the time the app is idle inside a loop that just waits for something to happen and then calls out to the right place when it does. On top of this, Gtk+ has a frame clock that gives a “pulse” to the application. This clock beats at a steady rate, which is tied to the framerate of the output (this is synced to the monitor via the window manager/compositor). The clock has several phases:

- Events
- Update
- Layout
- Paint
The phases happen in this order, and we always run each phase through before going back to the start. The Events phase is a long stretch of time between each redraw where we get input events from the user and other events (like e.g. network i/o). Some events, like mouse motion, are compressed so that we only get a single mouse motion event per clock cycle.
Once the event phase is over we pause all external events and run the redraw loop. First is the Update phase, where all animations are run to calculate the new state based on the estimated time the next frame will be visible (available via the frame clock). This often involves geometry changes, which drive the next phase, Layout. If there are any changes in widget size requirements we calculate a new layout for the widget hierarchy (i.e. we assign sizes and positions). Then we go to the Paint phase, where we redraw the regions of the window that need redrawing.
If nothing requires the update/layout/paint phases we will stay in the event phase forever, as we don’t want to redraw if nothing changes. Each phase can request further processing in the following phases (e.g. the update phase will cause there to be layout work, and layout changes cause repaints).
There are multiple ways to drive the clock. At the lowest level you can request a particular phase with gdk_frame_clock_request_phase(), which will schedule a clock beat as needed so that it eventually reaches the requested phase. In practice, however, most things happen at higher levels:
- If you are doing an animation, you can use gtk_widget_add_tick_callback(), which will cause a regular beating of the clock with a callback in the update phase until you stop the tick.
- If some state change causes the size of your widget to change, you call gtk_widget_queue_resize(), which will request a layout phase and mark your widget as needing relayout.
- If some state changes so that you need to redraw some area of your widget, you use the normal gtk_widget_queue_draw() set of functions. These will request a paint phase and mark the region as needing redraw.
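To make the animation case concrete, here is a minimal sketch of a tick callback driving a fade-in. The fade effect, its two-second duration, and the colors are invented for illustration; the API calls (gtk_widget_add_tick_callback(), gdk_frame_clock_get_frame_time(), gtk_widget_queue_draw()) are the real ones described above.

```c
#include <gtk/gtk.h>

static gdouble opacity = 0.0;

static gboolean
on_draw (GtkWidget *widget, cairo_t *cr, gpointer data)
{
  /* Paint a blue rectangle at the current animation opacity. */
  cairo_set_source_rgba (cr, 0.2, 0.4, 0.8, opacity);
  cairo_paint (cr);
  return FALSE;
}

/* Runs once per clock beat, during the update phase. */
static gboolean
fade_tick (GtkWidget *widget, GdkFrameClock *clock, gpointer data)
{
  gint64 start = *(gint64 *) data;
  /* Estimated time the next frame will be visible, in microseconds. */
  gint64 now = gdk_frame_clock_get_frame_time (clock);
  gdouble elapsed = (now - start) / (gdouble) G_USEC_PER_SEC;

  opacity = MIN (elapsed / 2.0, 1.0);  /* fade in over two seconds */
  gtk_widget_queue_draw (widget);      /* request the paint phase */

  if (opacity >= 1.0)
    return G_SOURCE_REMOVE;            /* animation done: stop ticking */
  return G_SOURCE_CONTINUE;
}

int
main (int argc, char **argv)
{
  static gint64 start;

  gtk_init (&argc, &argv);

  GtkWidget *window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
  GtkWidget *area = gtk_drawing_area_new ();
  gtk_container_add (GTK_CONTAINER (window), area);
  g_signal_connect (area, "draw", G_CALLBACK (on_draw), NULL);
  g_signal_connect (window, "destroy", G_CALLBACK (gtk_main_quit), NULL);

  start = g_get_monotonic_time ();
  gtk_widget_add_tick_callback (area, fade_tick, &start, NULL);

  gtk_widget_show_all (window);
  gtk_main ();
  return 0;
}
```

Note that the tick callback only updates state and queues a redraw; the actual drawing happens later, in the paint phase, which keeps the phases cleanly separated.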
There are also a lot of implicit triggers of these from the CSS layer (which does animations, resizes and repaints as needed).
During the Paint phase we send a single expose event to the toplevel window. The event handler creates a cairo context for the window and emits the GtkWidget::draw signal on it, which propagates down the entire widget hierarchy in back-to-front order, using the clipping and transform of the cairo context. This lets each widget draw its content at the right place and time, correctly handling things like partial transparency and overlapping widgets.
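As a sketch of what this means for widget code, here is a minimal draw handler that works within the clip and transform it is handed rather than resetting them (the drawing itself and the colors are arbitrary, invented for illustration):

```c
#include <gtk/gtk.h>

/* The cairo context passed to a "draw" handler is already clipped to the
   region that needs redrawing and translated so that (0, 0) is the
   widget's top-left corner. */
static gboolean
on_draw (GtkWidget *widget, cairo_t *cr, gpointer data)
{
  int width = gtk_widget_get_allocated_width (widget);
  int height = gtk_widget_get_allocated_height (widget);

  /* Save and restore around our drawing instead of resetting the clip
     or transform that the hierarchy above us set up. */
  cairo_save (cr);
  cairo_rectangle (cr, 0, 0, width, height);
  cairo_clip (cr);   /* further restrict; never cairo_reset_clip() */
  cairo_set_source_rgb (cr, 0.3, 0.5, 0.8);
  cairo_paint (cr);
  cairo_restore (cr);

  return FALSE;   /* keep propagating so other handlers run too */
}

int
main (int argc, char **argv)
{
  gtk_init (&argc, &argv);

  GtkWidget *window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
  GtkWidget *area = gtk_drawing_area_new ();
  gtk_container_add (GTK_CONTAINER (window), area);
  g_signal_connect (area, "draw", G_CALLBACK (on_draw), NULL);
  g_signal_connect (window, "destroy", G_CALLBACK (gtk_main_quit), NULL);

  gtk_widget_show_all (window);
  gtk_main ();
  return 0;
}
```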
There are some notable differences here compared to how it used to work:
- Normally there is only a single cairo context, used for the entire repaint, rather than one per GdkWindow. This means you have to respect (and not reset) the existing clip and transformation set on it.
- Most widgets, including those that create their own GdkWindows have a transparent background, so they draw on top of whatever widgets are below them. This was not the case before where the theme set the background of most widgets to the default background color. (In fact, transparent GdkWindows used to be impossible.)
- The whole rendering hierarchy is captured in the call stack, rather than having multiple separate draw emissions as before, so you can use effects like cairo_push_group()/cairo_pop_group(), which will affect all the widgets below you in the hierarchy. This lets you have e.g. partially transparent containers.
- Due to backgrounds being transparent we no longer use a self-copy operation when GdkWindows move or scroll. We just mark the entire affected area for repainting when these operations are used. This allows (partially) transparent backgrounds, and it also more closely models modern hardware where self-copy operations are problematic (they break the rendering pipeline).
- Due to the above, the old way of scrolling is now slower, so we do scrolling differently. Containers that scroll a lot (GtkViewport, GtkTextView, GtkTreeView, etc.) allocate an offscreen image during scrolling and render their children to it (which is now possible since draw is fully hierarchical). The offscreen image is a bit larger than the visible area, so most of the time scrolling just requires drawing the offscreen image at a different position. This matches contemporary graphics hardware much better, as well as allowing efficient transparent backgrounds.
In order for this to work, such containers need to detect when child widgets are redrawn so that they can update the offscreen copy. This can be done with the new gdk_window_set_invalidate_handler() function.
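As a rough sketch of how a scrolling container might hook this up: the container type and its offscreen-repaint helper below are hypothetical, invented for illustration, while gdk_window_set_invalidate_handler() itself is the real Gtk+ 3.10 API.

```c
#include <gtk/gtk.h>

/* Called by GDK whenever a region of the window is invalidated, e.g.
   because a child widget queued a redraw. */
static void
view_invalidate_handler (GdkWindow *window, cairo_region_t *region)
{
  /* Hypothetical helper: re-render just the dirty region of the
     container's offscreen surface, then let the normal repaint
     machinery copy the offscreen to the screen. */
  my_view_update_offscreen (window, region);
}

/* In the (hypothetical) container's realize handler, after its
   GdkWindow has been created: */
static void
my_view_realize (GtkWidget *widget)
{
  GdkWindow *window = gtk_widget_get_window (widget);

  /* Available since Gtk+ 3.10. */
  gdk_window_set_invalidate_handler (window, view_invalidate_handler);
}
```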
This skips some of the details, but with this overview you should have a better idea of what happens when your code is called, and be in a better position to do further research if necessary.
Footnote 1: If you need more precision for mouse events than one motion event per frame, you need to look at the mouse device history.
Footnote 2: Actually, each native subwindow will get an expose event too, but that is not very common these days.