Implementing Client-Side Post Processing Effects in Wayland
One of the most notable and enduring Linux desktop paradigms has been desktop effects: the coupling of various desktop environments with graphical niceties, also known as “eye candy.” With the advent of the XComposite extension in the mid-2000s, mainstream eye candy was taken to new levels by a small project called Compiz, which used the texture-from-pixmap extension to apply hardware-accelerated effects to windows on the fly.
From Compiz came the effect known throughout the Linux world as “wobbly windows” or, to many developers, “that feature we aren’t implementing.” By applying OpenGL vertex transforms to the textures created with the aforementioned extension, windows appear to deform when moved and resized. This has remained the benchmark for graphical desktop effects for a decade, and numbers may not go high enough to enumerate all the users who have asked for this effect in Enlightenment over the years.
Partial implementations of the effect have been developed a couple of times over the lifetime of E17 and onwards. First there was Ecomorph in 2009, which effectively chained a modified version of Compiz to the window manager; next came the ecompiz module in 2015, which allowed loading unmodified Compiz effect modules – at the cost of huge amounts of overdraw – to approximate the corresponding effects in newer versions of Enlightenment. Neither of these is a great solution in today’s modern world of GLES/EGL, Wayland, and embedded devices: Compiz uses Xlib internally as well as GLX, enforcing a dependency on X11 and the X server. Furthermore, given the server-side location of the effect handling, it does not play nicely with the client-oriented Wayland protocol.
And so it was, armed with this knowledge, that a pair of procrastinating-yet-ambitious Samsung OSG graphics engineers set out to improve the Wayland world with client-side window post-processing effects. The goal: implement wobbly windows using the client-side decoration region and a lot of elbow grease. The difficulty level was entirely precedented, but the usefulness levels were off the charts. [duly noted. /management]
The first step in this overwhelmingly important task was to determine how to get movement information to Wayland applications without breaking their sandboxes. The wobbly algorithm requires the window’s velocity in order to calculate the wobble, something that’s impossible to acquire using the normal Wayland protocol because applications are never notified of their position. The decision was made to create a protocol extension that would provide applications with relative movement info, maintaining the client sandbox while solving the problem. The protocol would also provide drag info, i.e., when a window began and stopped being dragged, in order to accurately anchor the effect. Using both of these methods, the protocol would allow Wayland Wobbly Windows (WWW) by means of Client-Side Shimmying (CSS).
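What such an extension might look like as a Wayland protocol description — the interface and event names below are invented for illustration and do not correspond to the actual extension:

```xml
<!-- Hypothetical sketch only: names are illustrative, not the real protocol. -->
<protocol name="relative_move">
  <interface name="zwp_relative_move_v1" version="1">
    <event name="drag_start">
      <description summary="the compositor began dragging the surface"/>
    </event>
    <event name="move">
      <description summary="relative motion of the surface since the last event"/>
      <arg name="dx" type="int"/>
      <arg name="dy" type="int"/>
    </event>
    <event name="drag_end">
      <description summary="the compositor stopped dragging the surface"/>
    </event>
  </interface>
</protocol>
```

Because only deltas are delivered, a client can derive velocity for the effect while still never learning its absolute position on screen.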
Figuring out a solution to this basic issue allowed the project to begin in earnest. The methodology was to inject the wobbly algorithm from the Compiz plugin into the toolkit rendering engine and utilize the client-side decoration space to draw the effect without modifying input regions or window geometry. This would allow the compositor to accurately position the window during deformations, resulting in non-rectangular geometry while still providing an expandable area for the wobble to render into. Moreover, when the wobble begins the opaque region should be temporarily unset to ensure that the transparency resulting from the deforms will be blended properly in the compositor. The battle began, and there were many issues along the way.
What Does it Take to Wobble?
Canvas output integration turned out to be the first problem in the implementation. Compiz uses OpenGL calls from very early versions of the desktop API that are not supported in GLES; these parts needed to be rewritten to use supported calls, such as converting all geometric primitives from GL_QUADS to GL_TRIANGLES. GLES also requires fragment and vertex shaders, so these had to be added in order to get a successful draw. All of this, of course, was on top of the work required to port the effect algorithm to work inside the canvas output engine.
Another minor issue that popped up during the effect algorithm adaptation phase was that Compiz draws its composited effects using screen geometry, so this part needed to be refactored to manage each surface in isolation. Fortunately, this turned out to be easy to resolve: because applications are sandboxed, each surface could be treated as “the screen,” making geometry translations unnecessary.
More significantly, when generating the move effect it became apparent that the anchor point for the pointer’s location was in the wrong place. This was the result of a problem with EFL’s canvas framespace: the method for implementing client-side decoration.
With canvas framespace, any window decor is included in a region outside the reachable canvas bounds; this allows the objects in the window to report coordinates originating at 0,0 on the canvas. The output engine still renders the framespace region normally, but when doing work with the engine to apply post-processing effects it can cause the pointer coordinates to appear strange. For example, if the framespace includes a 5×10 region to the left and top of the window, the top-left corner of the surface would be -5,-10, and the pointer coordinates would be reported in that location.
Perhaps the biggest annoyance during the process was applying the effect to windows during resize. Resizing a window from any part of the left or top edges requires moving the window, which results in animation. Coupled with the resizing, this had some interesting side effects:
Upon further inspection of the original Compiz code and the default configuration, resize is blocked from triggering the effect. Applying this to the updated effect code resolved the issue for the time being.
As a bonus, a small issue encountered during the very early stages of the project was the velocity from the initial placement of the window creating some even more interesting animations:
Not only that, sending the initial placement move deltas was effectively notifying clients of their window positions: an action that is frowned upon in the Wayland community. The initial placement move deltas are no longer sent, resolving both of these issues.
Overall, this was a fun project that shows the power of some Wayland features. It’s worth noting that while this implementation adds a copy to the rendering pipeline as a result of the framebuffer object used during the post-processing stage, it’s possible to implement the effect without any extra copies using more complex vertex shaders. This was skipped due to the time we had budgeted for the project, but it would be worth looking into in order to improve performance on embedded and lower-spec hardware.
That’s it for now, but here’s a look at the results of the project:
Just kidding, that was another bug that was found along the way. Here’s the real deal: