
It seems like Virtualgl does not want to work with GtkGLArea #158

Closed
nicholizer opened this issue Feb 19, 2021 · 8 comments

@nicholizer

I compiled the example from https://gtk.dashgl.com/Brickout/Draw_a_Triangle/ and it runs as expected, but when I tried to run it with vglrun, it displayed a window with the text:

no available configurations for the given rgba pixel format

and the terminal showed:

niko@niko-Satellite-C655:~/Desktop/Untitled Folder$ vglrun +v ./demo3-app
[VGL] Shared memory segment ID for vglconfig: 163856
[VGL] VirtualGL v2.6.4 64-bit (Build 20200626)
[VGL] NOTICE: Replacing dlopen("libGLX.so.1") with dlopen("libvglfaker.so")
[VGL] Opening connection to 3D X server :0
[VGL] NOTICE: Replacing dlopen("libGL.so.1") with dlopen("libvglfaker.so")
[VGL] WARNING: Could not load function "glXBindSwapBarrierNV"
[VGL] WARNING: Could not load function "glXJoinSwapGroupNV"
[VGL] WARNING: Could not load function "glXQueryFrameCountNV"
[VGL] WARNING: Could not load function "glXQueryMaxSwapGroupsNV"
[VGL] WARNING: Could not load function "glXQuerySwapGroupNV"
[VGL] WARNING: Could not load function "glXResetFrameCountNV"
[VGL] WARNING: Could not load function "glXBindSwapBarrierNV"
[VGL] WARNING: Could not load function "glXJoinSwapGroupNV"
[VGL] WARNING: Could not load function "glXQueryFrameCountNV"
[VGL] WARNING: Could not load function "glXQueryMaxSwapGroupsNV"
[VGL] WARNING: Could not load function "glXQuerySwapGroupNV"
[VGL] WARNING: Could not load function "glXResetFrameCountNV"
[VGL] WARNING: Could not load function "glXBindSwapBarrierNV"
[VGL] WARNING: Could not load function "glXJoinSwapGroupNV"
[VGL] WARNING: Could not load function "glXQueryFrameCountNV"
[VGL] WARNING: Could not load function "glXQueryMaxSwapGroupsNV"
[VGL] WARNING: Could not load function "glXQuerySwapGroupNV"
[VGL] WARNING: Could not load function "glXResetFrameCountNV"
[VGL] WARNING: Could not load function "glXBindSwapBarrierNV"
[VGL] WARNING: Could not load function "glXJoinSwapGroupNV"
[VGL] WARNING: Could not load function "glXQueryFrameCountNV"
[VGL] WARNING: Could not load function "glXQueryMaxSwapGroupsNV"
[VGL] WARNING: Could not load function "glXQuerySwapGroupNV"
[VGL] WARNING: Could not load function "glXResetFrameCountNV"
[VGL] WARNING: Could not load function "glXBindSwapBarrierNV"
[VGL] WARNING: Could not load function "glXJoinSwapGroupNV"
[VGL] WARNING: Could not load function "glXQueryFrameCountNV"
[VGL] WARNING: Could not load function "glXQueryMaxSwapGroupsNV"
[VGL] WARNING: Could not load function "glXQuerySwapGroupNV"
[VGL] WARNING: Could not load function "glXResetFrameCountNV"
[VGL] WARNING: Could not load function "glXBindSwapBarrierNV"
[VGL] WARNING: Could not load function "glXJoinSwapGroupNV"
[VGL] WARNING: Could not load function "glXQueryFrameCountNV"
[VGL] WARNING: Could not load function "glXQueryMaxSwapGroupsNV"
[VGL] WARNING: Could not load function "glXQuerySwapGroupNV"
[VGL] WARNING: Could not load function "glXResetFrameCountNV"
[VGL] WARNING: Could not load function "glXBindSwapBarrierNV"
[VGL] WARNING: Could not load function "glXJoinSwapGroupNV"
[VGL] WARNING: Could not load function "glXQueryFrameCountNV"
[VGL] WARNING: Could not load function "glXQueryMaxSwapGroupsNV"
[VGL] WARNING: Could not load function "glXQuerySwapGroupNV"
[VGL] WARNING: Could not load function "glXResetFrameCountNV"
[VGL] WARNING: Could not load function "glXBindSwapBarrierNV"
[VGL] WARNING: Could not load function "glXJoinSwapGroupNV"
[VGL] WARNING: Could not load function "glXQueryFrameCountNV"
[VGL] WARNING: Could not load function "glXQueryMaxSwapGroupsNV"
[VGL] WARNING: Could not load function "glXQuerySwapGroupNV"
[VGL] WARNING: Could not load function "glXResetFrameCountNV"
[VGL] WARNING: Could not load function "glXBindSwapBarrierNV"
[VGL] WARNING: Could not load function "glXJoinSwapGroupNV"
[VGL] WARNING: Could not load function "glXQueryFrameCountNV"
[VGL] WARNING: Could not load function "glXQueryMaxSwapGroupsNV"
[VGL] WARNING: Could not load function "glXQuerySwapGroupNV"
[VGL] WARNING: Could not load function "glXResetFrameCountNV"
[VGL] NOTICE: Replacing dlopen("libGL.so.1") with dlopen("libvglfaker.so")
on realize
Unknown error

@nicholizer
Author

I solved this issue. The problem was with GtkWindow. As stated in cztomczak/cefcapi#9 (comment), GTK+ > 3.15.1 uses an X11 visual optimized for GTK+'s OpenGL stuff. The author of that comment suggests forcing GTK to use the default X11 visual instead of GTK's blessed one (I made minor changes to his code):

#include <gtk/gtk.h>
#include <gdk/gdkx.h>
#include <X11/Xlib.h>

GtkWidget* create_gtk_window() {
    printf("create_gtk_window\n");
    // Create window.
    GtkWidget* window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    // ...
    // GTK+ > 3.15.1 uses an X11 visual optimized for GTK+'s OpenGL stuff
    // since revid dae447728d: https://github.com/GNOME/gtk/commit/dae447728d
    // However, it breaks CEF: https://github.com/cztomczak/cefcapi/issues/9
    // Let's use the default X11 visual instead of GTK's blessed one.
    GdkScreen* screen = gdk_screen_get_default();
    GList* visuals = gdk_screen_list_visuals(screen);
    printf("n visuals: %u\n", g_list_length(visuals));
    GdkX11Screen* x11_screen = GDK_X11_SCREEN(screen);
    g_assert(x11_screen != NULL);
    // The default (root) visual of the 2D X server.
    Visual* default_xvisual = DefaultVisual(GDK_SCREEN_XDISPLAY(x11_screen),
        GDK_SCREEN_XNUMBER(x11_screen));
    GdkVisual* default_visual = NULL;
    int i = 0;
    // Find the GdkVisual that wraps the default X visual.
    for (GList* l = visuals; l != NULL; l = l->next) {
        GdkVisual* visual = GDK_X11_VISUAL(l->data);
        if (default_xvisual->visualid == gdk_x11_visual_get_xvisual(
                GDK_X11_VISUAL(l->data))->visualid) {
            printf("Default visual %d\n", i);
            default_visual = visual;
        }
        i++;
    }
    g_list_free(visuals);
    // Force the window to use the default X11 visual.
    gtk_widget_set_visual(GTK_WIDGET(window), default_visual);
    // gtk_widget_show_all(window);
    return window;
}

So now I add this to my GTK code and use create_gtk_window() instead of gtk_window_new(GTK_WINDOW_TOPLEVEL), and my programs work with vglrun.
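
A minimal sketch of how this slots into a GtkGLArea program (assumed usage, not part of the original report; the tutorial's realize/render callbacks would be connected to gl_area as usual):

// Assumed usage sketch: create the window with the workaround above,
// then add a GtkGLArea to it as in the tutorial.
int main(int argc, char** argv) {
    gtk_init(&argc, &argv);

    GtkWidget* window = create_gtk_window();   // instead of gtk_window_new()
    GtkWidget* gl_area = gtk_gl_area_new();
    gtk_container_add(GTK_CONTAINER(window), gl_area);

    g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);
    gtk_widget_show_all(window);
    gtk_main();
    return 0;
}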
Thank you for VirtualGL!

@dcommander
Member

Glad you were able to work around the issue in your code, but I still need to investigate why it is occurring in VirtualGL, since other applications may also encounter it.

@dcommander
Member

dcommander commented Mar 2, 2021

A description of the problem:

GDK first iterates through all available 2D X server visuals and chooses an appropriate "RGBA visual" (the first depth-32 visual with BGRA color masks) and an appropriate "system visual" (the first visual whose visual ID matches the default visual of the 2D X server.) In the case of TurboVNC, the RGBA visual that GDK picks is 0x68, and the system visual it picks is 0x21.

So far so good, but then GDK re-orders the visuals such that depth-32 visuals are given precedence over depth-24 visuals and DirectColor visuals are given precedence over TrueColor visuals. The sorting algorithm that GDK uses moves the 2D X server's default visual (0x21) down in the list so that it is no longer the first depth-24 TrueColor visual. GDK then iterates through the re-ordered list of visuals and tries to pick a visual that matches the OpenGL attributes of the system visual, but since VirtualGL 2.x does not assign OpenGL attributes to 2D X server visuals, GDK matches another visual first. This then becomes the new system visual.

GDK then iterates through the list of GLXFBConfigs and attempts to find one for which glXGetVisualFromFBConfig() returns the new system visual. That will never happen with VirtualGL 2.x, because VGL 2.x maps all of the 24-bit GLXFBConfigs to the first matching depth-24 TrueColor 2D X server visual (which will be the same as the initial system visual that GDK picks, not the new system visual that it picks after re-ordering.) Thus, it is the re-ordering of visuals that GDK performs in _gdk_x11_screen_init_visuals() that ultimately foils VirtualGL 2.x.
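
A simplified sketch of the final matching step that fails under VGL 2.x (a paraphrase of the behavior described above, not GDK's actual code):

// Sketch only: look for a GLXFBConfig whose associated X visual is the chosen
// "system visual".  Under VGL 2.x this search comes up empty, because every
// 24-bit FB config maps back to the same depth-24 TrueColor 2D X server visual.
#include <GL/glx.h>
#include <stdio.h>

static GLXFBConfig find_fbconfig_for_visual(Display *dpy, VisualID system_visual)
{
    int n = 0;
    GLXFBConfig *configs = glXGetFBConfigs(dpy, DefaultScreen(dpy), &n);
    GLXFBConfig match = NULL;

    for (int i = 0; i < n && !match; i++) {
        XVisualInfo *vi = glXGetVisualFromFBConfig(dpy, configs[i]);
        if (vi) {
            if (vi->visualid == system_visual)
                match = configs[i];
            XFree(vi);
        }
    }
    XFree(configs);
    if (!match)
        fprintf(stderr, "No FB config maps to visual 0x%lx\n",
                (unsigned long)system_visual);
    return match;
}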

GDK is, to make a long story short, heavily relying on a 1:1 correspondence between GLXFBConfigs (which are on the 3D X server) and X visuals (which are on the 2D X server), which can't be guaranteed in VirtualGL. VGL 2.x works perfectly well with applications that start with a GLXFBConfig and request a visual to match, or with applications that select a visual using explicit OpenGL attributes (glXChooseVisual().) With the use of the VGL_DEFAULTFBCONFIG environment variable, VGL 2.x can be made to work with applications that iterate through the list of 2D X server visuals searching for one with a specific set of OpenGL attributes. However, VGL 2.x can't be made to work with applications that do the latter as well as expect any arbitrary 2D X server visual to have been mapped to a GLXFBConfig. Thus, the application-level workaround you implemented above (explicitly selecting a GDK visual) is the only workaround for VGL 2.x.
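
For contrast, this is the selection pattern that VGL 2.x handles well: choosing a visual by explicit OpenGL attributes rather than iterating over 2D X server visuals (a minimal sketch using standard GLX calls, nothing VirtualGL-specific):

#include <GL/glx.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    // Ask GLX for a visual matching explicit OpenGL attributes.
    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER,
                      GLX_RED_SIZE, 8, GLX_GREEN_SIZE, 8, GLX_BLUE_SIZE, 8,
                      GLX_DEPTH_SIZE, 24, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (vi) {
        printf("Chose visual 0x%lx\n", (unsigned long)vi->visualid);
        XFree(vi);
    }
    XCloseDisplay(dpy);
    return 0;
}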

Fortunately, however, VirtualGL 3.0 includes a new feature that automatically assigns OpenGL properties, in round-robin fashion, to all available 2D X server visuals and preemptively chooses a GLXFBConfig for each. It also returns False when GLX_USE_GL is queried via glXGetConfig() for any 2D X server visual that doesn't have a GLXFBConfig attached (i.e. any 2D X server visual whose assigned OpenGL properties didn't match a GLXFBConfig in the 3D X server.) This effectively works around the GtkGLArea issue.
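
For illustration, this is how an application typically probes a 2D X server visual's OpenGL capability; under the VGL 3.0 behavior described above, a visual without an attached GLXFBConfig reports GLX_USE_GL as 0 in this query (a sketch using standard GLX/Xlib calls):

#include <GL/glx.h>
#include <stdio.h>

static void probe_visuals(Display *dpy)
{
    int n = 0;
    XVisualInfo template_vis, *visuals;
    template_vis.screen = DefaultScreen(dpy);
    visuals = XGetVisualInfo(dpy, VisualScreenMask, &template_vis, &n);

    for (int i = 0; i < n; i++) {
        int use_gl = 0;
        // glXGetConfig() returns 0 on success and stores the attribute value.
        if (glXGetConfig(dpy, &visuals[i], GLX_USE_GL, &use_gl) == 0)
            printf("Visual 0x%lx: GLX_USE_GL = %d\n",
                   (unsigned long)visuals[i].visualid, use_gl);
    }
    XFree(visuals);
}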

@biochem-fan

@dcommander

I am using virtualgl_3.0_amd64.deb but have the same issue with GtkGLArea.
I am testing on https://github.com/ebassi/glarea-example.
Inserting the above code to use the default X11 visual solved the issue.

So I guess the problem still persists.

@dcommander
Member

@biochem-fan Reproduced, but I swear it worked before. :| Now GTK is picking Visual 0x2c2 on my system, and I have no idea why. I know it isn't deliberately trying to break VGL, but you couldn't devise a better way to break VGL than the one that GTK is using. I don't see any purpose to their visual selection scheme. It seems needlessly complicated and fragile.

@dcommander dcommander reopened this Feb 15, 2024
@dcommander
Member

Reopening because I think this same problem, or at least a very similar one, occurs with Chrome/ANGLE. (Refer to #229.)

@dcommander
Member

It appears that GNOME/gtk@ca8d9fb in GTK4 might have fixed this, but I have no way to verify, since the GtkGLArea example only works with GTK3. Still, though, it would be good for VGL to work around this for GTK3.

@dcommander
Member

A proper fix for this has been implemented in 398e941. Please test.

dcommander added a commit that referenced this issue Feb 16, 2024
Starting with
GNOME/gtk@dae4477
(v3.15.2) and ending with
GNOME/gtk@1c55b32
(v4.3.2), GTK used glXGetConfig() to probe the OpenGL rendering
attributes of all X visuals, it picked "system" (opaque) and "RGBA"
(transparent) visuals based on those attributes, and it expected to find
a GLX FB config with a GLX_VISUAL_ID attribute corresponding to one of
those visuals.  (The purpose of this was to ensure that GTK windows can
always be used for OpenGL rendering, since the decision to enable OpenGL
rendering may occur after the window is created.)

Starting with
GNOME/gtk@8c7623d
(GTK v3.15.2) and ending with
GNOME/gtk@62bac44
(GTK v4.3.2), GTK cached the system and RGBA visual IDs in an X root
window property to avoid re-picking the visuals every time GTK was
initialized on a particular X display.

These mechanisms only worked reliably if there was a 1:1 correspondence
between 2D X server visuals and GLX FB configs, which isn't the case
with VirtualGL.  More specifically, if GTK was initialized on the 2D X
server without VirtualGL, then GTK used the 2D X server's GLX
implementation to probe the OpenGL rendering attributes of the 2D X
server visuals.  That caused GTK to pick and cache different visuals
than it would have picked had VirtualGL been active, and it was likely
that none of the GLX FB configs returned by VirtualGL had a
GLX_VISUAL_ID attribute corresponding to one of those visuals.

The easiest workaround is for the VirtualGL Faker to delete the
GDK_VISUALS X root window property in the body of XOpenDisplay(), thus
forcing GTK to re-pick the system and RGBA visuals using VirtualGL's
interposed version of glXGetConfig().

Fixes #158
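
For illustration, the property deletion described in the commit would look roughly like the following; this is a sketch of the Xlib calls involved, not VirtualGL's actual faker code:

#include <X11/Xlib.h>

// Sketch of the workaround described above: delete the GDK_VISUALS root
// window property so that GTK re-picks its visuals through VirtualGL's
// interposed glXGetConfig().  (Illustrative only, not the faker's code.)
static void delete_gdk_visuals_property(Display *dpy)
{
    // Look the atom up only if it already exists; don't create it.
    Atom gdk_visuals = XInternAtom(dpy, "GDK_VISUALS", True);
    if (gdk_visuals != None)
        XDeleteProperty(dpy, DefaultRootWindow(dpy), gdk_visuals);
}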