Modernizing GTK’s macOS backend (again)

Since the early days of working on the macOS backend for GTK 4, I knew we’d eventually have to follow suit with what the major browsers were doing in terms of drawing efficiency. Using OpenGL (while deprecated on macOS, it’s certainly not going anywhere) was fine from a raw rendering-performance standpoint, but it did have a few drawbacks.

In particular, OpenGL (and, as far as I know, Metal) layers don’t have a way to damage specific regions of the GPU rendering. That means that as we flip between front and back buffers, the compositor re-composites the whole window. That’s rather expensive for the areas that didn’t change, even when using a “scissor rect”.

If you’re willing to go through the effort of using IOSurface, there does exist another possibility. So this past week I read up on the APIs for CALayer and IOSurface and began strapping things together. As a life-long Linux user, I must say I’m not very impressed with the macOS experience as a user or application author, but hey, it’s a thing, and I guess it matters.

An IOSurface is like a CPU-fronted cache on a GPU texture. You can move the buffer between CPU and GPU (which is helpful for software rendering with cairo) or leave it pretty much just on the GPU (unless it gets paged out). It also has the nice property that you can bind it to an OpenGL texture using GL_TEXTURE_RECTANGLE. Once you have that texture, you can attach it to a GL_FRAMEBUFFER and render into it.
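For the curious, here’s a rough sketch of what that wiring can look like: CGLTexImageIOSurface2D() binds the IOSurface to a rectangle texture, and that texture then becomes the color attachment of a framebuffer object. The helper name is made up and this is not the code from the actual branch, just the general shape of it.

#include <IOSurface/IOSurface.h>
#include <OpenGL/OpenGL.h>
#include <OpenGL/gl3.h>
#include <OpenGL/CGLIOSurface.h>

/* Hypothetical helper: set up a framebuffer that renders into @surface. */
static GLuint
create_framebuffer_for_surface (CGLContextObj cgl_context,
                                IOSurfaceRef  surface,
                                GLuint       *out_texture)
{
  GLsizei width = (GLsizei)IOSurfaceGetWidth (surface);
  GLsizei height = (GLsizei)IOSurfaceGetHeight (surface);
  GLuint texture, framebuffer;

  /* Bind the IOSurface to a GL_TEXTURE_RECTANGLE so the GPU renders
   * directly into the surface's memory. */
  glGenTextures (1, &texture);
  glBindTexture (GL_TEXTURE_RECTANGLE, texture);
  CGLTexImageIOSurface2D (cgl_context, GL_TEXTURE_RECTANGLE, GL_RGBA,
                          width, height, GL_BGRA,
                          GL_UNSIGNED_INT_8_8_8_8_REV, surface, 0);

  /* Back a framebuffer with that texture for rendering. */
  glGenFramebuffers (1, &framebuffer);
  glBindFramebuffer (GL_FRAMEBUFFER, framebuffer);
  glFramebufferTexture2D (GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_TEXTURE_RECTANGLE, texture, 0);

  *out_texture = texture;
  return framebuffer;
}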

That alone isn’t quite enough, though: you also need to be able to attach that content to a layer in your NSWindow. We have a base layer which hosts a bunch of tiles (each its own layer) whose layer.contents property can be mapped directly to an IOSurfaceRef, easy peasy.
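In code, each tile ends up looking something like the following. The helper and its arguments are hypothetical, but the key bit is real: Core Animation will take an IOSurfaceRef directly as a layer’s contents.

#import <QuartzCore/QuartzCore.h>
#import <IOSurface/IOSurface.h>

/* Hypothetical helper: one tile layer, backed by an IOSurface, added to a
 * parent layer hosted on the NSWindow's content view. */
static CALayer *
create_tile_layer (CALayer      *parent,
                   CGRect        frame,
                   IOSurfaceRef  surface,
                   BOOL          opaque)
{
  CALayer *tile = [CALayer layer];

  tile.frame = frame;
  tile.opaque = opaque;          /* opaque tiles skip blending in the compositor */
  tile.contentsGravity = kCAGravityBottomLeft;
  tile.contents = (id)surface;   /* use (__bridge id) when building with ARC */

  [parent addSublayer:tile];

  return tile;
}

Once a frame has been rendered you still need Core Animation to notice that the pixels changed; a common way to do that is to keep more than one IOSurface per tile and swap which one is assigned to contents.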

With that in place, all of the transparent regions use tiles limited to the transparent areas only (which will incur alpha blending by the compositor). The rest of the area is broken up into tiles which are opaque, do not require blending by the compositor, and can therefore be updated independently without damaging the rest of the window contents.
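Conceptually, the partitioning is just region math. Something along these lines, using the cairo regions GDK already works with (a simplified illustration with a made-up callback, not the real implementation):

#include <cairo.h>

typedef void (*TileFunc) (const cairo_rectangle_int_t *rect, int is_opaque);

/* Split the window extents into tile rectangles: everything covered by the
 * opaque region becomes opaque tiles, the remainder becomes blended tiles. */
static void
partition_tiles (const cairo_rectangle_int_t *extents,
                 const cairo_region_t        *opaque,
                 TileFunc                     create_tile)
{
  cairo_region_t *transparent = cairo_region_create_rectangle (extents);
  cairo_rectangle_int_t rect;
  int i;

  /* Whatever is not opaque must be alpha blended by the compositor. */
  cairo_region_subtract (transparent, opaque);

  for (i = 0; i < cairo_region_num_rectangles (opaque); i++)
    {
      cairo_region_get_rectangle (opaque, i, &rect);
      create_tile (&rect, 1);
    }

  for (i = 0; i < cairo_region_num_rectangles (transparent); i++)
    {
      cairo_region_get_rectangle (transparent, i, &rect);
      create_tile (&rect, 0);
    }

  cairo_region_destroy (transparent);
}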

You can see the opaque regions by using “Quartz Debug” and turning on the “Show opaque regions” checkbox. Sadly, screen-capture doesn’t appear to grab the yellow highlights you get when you turn on the “Flash screen updates” checkbox, so you’ll have to imagine that.

Opaque regions displayed in green

This is what all the major browsers are doing on macOS, and now we are too.

This also managed to simplify a lot of code in the macOS backend, which is always appreciated.

https://gitlab.gnome.org/GNOME/gtk/-/merge_requests/4477

GSignalGroup and GBindingGroup

Some of the first abstractions we made when creating GNOME Builder are now available to everyone in GObject!

In particular, writing text editors requires tracking lots of information and changes from various sources. Sometimes those changes come from second-degree objects, reached via the object you really care about.

For example, with a GtkTextView that might mean tracking changes to a GtkTextBuffer, GtkTextTagTable, or many other application-specific accessory objects through the form of signals and properties.

To make that easier, we now have GSignalGroup and GBindingGroup merged directly into GObject. At their core, they allow you to treat a series of signals or bindings as a group and connect/disconnect them as a group. Furthermore, you can use g_object_bind_property() to set the signal/binding group’s target instance from another object’s property, removing even more code from your application. More than likely, you’ll also end up handling things changing out from under you better than before.
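For example, tracking the current buffer of a GtkTextView might look something like the following sketch. The callback and widget names are hypothetical, and in real code you’d keep references to the groups around (say, in your widget’s instance struct), but it shows the shape of the API.

#include <gtk/gtk.h>

static void
on_buffer_changed (GtkTextBuffer *buffer,
                   gpointer       user_data)
{
  /* React to edits in whichever buffer is currently set on the view. */
}

static void
setup_buffer_tracking (GtkTextView *view,
                       GtkWidget   *copy_button)
{
  GSignalGroup *signals = g_signal_group_new (GTK_TYPE_TEXT_BUFFER);
  GBindingGroup *bindings = g_binding_group_new ();

  /* Connect to the buffer's signals as a group... */
  g_signal_group_connect (signals, "changed",
                          G_CALLBACK (on_buffer_changed), NULL);

  /* ...and bind some of its properties as a group. */
  g_binding_group_bind (bindings, "has-selection",
                        copy_button, "sensitive",
                        G_BINDING_SYNC_CREATE);

  /* Whenever GtkTextView:buffer changes, the old buffer is disconnected
   * and the new one is connected/bound automatically. */
  g_object_bind_property (view, "buffer", signals, "target",
                          G_BINDING_SYNC_CREATE);
  g_object_bind_property (view, "buffer", bindings, "source",
                          G_BINDING_SYNC_CREATE);
}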

In the process of upstreaming these objects, we did a bit of hardening and API cleanup, and added thread-safety guarantees.

Now, go and delete some code!