Separate X screens

Anyone here unironically using Xorg with separate X screens?
It allows you to avoid the limitations of Xinerama such as bad handling of mixed refresh rates and VRR not working with multiple displays active.
I'm curious how many people are using such a setup.
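
For reference, the kind of config I mean is one Device section per monitor via ZaphodHeads and no Xinerama in the layout. A sketch only; the driver, BusID, and output names below are placeholders for whatever your card actually has:

# /etc/X11/xorg.conf.d/10-zaphod.conf (sketch -- adjust identifiers to your hardware)
Section "Device"
    Identifier "card0-left"
    Driver     "amdgpu"            # placeholder driver
    BusID      "PCI:3:0:0"         # placeholder bus id (see lspci)
    Option     "ZaphodHeads" "DP-1"
    Screen     0
EndSection

Section "Device"
    Identifier "card0-right"
    Driver     "amdgpu"
    BusID      "PCI:3:0:0"
    Option     "ZaphodHeads" "HDMI-A-1"
    Screen     1
EndSection

Section "Screen"
    Identifier "screen0"
    Device     "card0-left"
EndSection

Section "Screen"
    Identifier "screen1"
    Device     "card0-right"
EndSection

Section "ServerLayout"
    Identifier "zaphod"
    Screen     0 "screen0"
    Screen     1 "screen1" RightOf "screen0"
EndSection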

  1. 2 years ago
    Anonymous

    Mixed refresh rates actually work perfectly fine with randr and xcb. I think VRR is indeed broken though.

    • 2 years ago
      Anonymous

      >Mixed refresh rates actually work perfectly fine with randr and xcb.
      Do applications actually use the correct refresh rate on all displays? Do they not just always run at the refresh rate of the primary display?

      • 2 years ago
        Anonymous

        Many don't run correctly, and it's often mistakenly believed to be an Xorg limitation, but it's actually not. I'd wager it's because most people still use Xlib, which is limited in a lot of ways but way easier to program in. Also, xcb's documentation is trash. However, with xcb you can correctly detect dynamic events, like your window moving to a different monitor, and update your rendering appropriately.
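
        Rough sketch of what I mean, using the RandR 1.5 monitor API through xcb (window id on argv[1]; rerun it on ConfigureNotify to track moves; build with cc mon.c -lxcb -lxcb-randr):

        /* sketch: which RandR monitor is a window on? (needs RandR >= 1.5) */
        #include <stdio.h>
        #include <stdlib.h>
        #include <xcb/xcb.h>
        #include <xcb/randr.h>

        int main(int argc, char **argv)
        {
            if (argc < 2) return 1;
            xcb_connection_t *c = xcb_connect(NULL, NULL);
            xcb_screen_t *s = xcb_setup_roots_iterator(xcb_get_setup(c)).data;
            xcb_window_t win = (xcb_window_t)strtoul(argv[1], NULL, 0);

            /* window center in root coordinates */
            xcb_get_geometry_reply_t *g =
                xcb_get_geometry_reply(c, xcb_get_geometry(c, win), NULL);
            xcb_translate_coordinates_reply_t *t = xcb_translate_coordinates_reply(
                c, xcb_translate_coordinates(c, win, s->root, 0, 0), NULL);
            int cx = t->dst_x + g->width / 2, cy = t->dst_y + g->height / 2;

            /* walk the monitor list, print the one containing the center */
            xcb_randr_get_monitors_reply_t *r = xcb_randr_get_monitors_reply(
                c, xcb_randr_get_monitors(c, s->root, 1), NULL);
            xcb_randr_monitor_info_iterator_t it =
                xcb_randr_get_monitors_monitors_iterator(r);
            for (; it.rem; xcb_randr_monitor_info_next(&it))
                if (cx >= it.data->x && cx < it.data->x + it.data->width &&
                    cy >= it.data->y && cy < it.data->y + it.data->height)
                    printf("window is on monitor at %d,%d (%dx%d)\n",
                           it.data->x, it.data->y, it.data->width, it.data->height);
            free(g); free(t); free(r);
            xcb_disconnect(c);
            return 0;
        }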

        • 2 years ago
          Anonymous

          What about compositors? Is it possible to use xcb to make a compositor that works with mixed refresh rates?

          • 2 years ago
            Anonymous

            It should be. I don't really use compositors, but for example the TearFree option (which is basically compositing just on the driver level) and glxgears work correctly.
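
            If anyone wants to try that, TearFree is just a driver option in xorg.conf (amdgpu/intel spelling shown; iirc recent servers grew it in modesetting too):

            Section "Device"
                Identifier "gpu0"
                Driver     "amdgpu"            # or "intel"
                Option     "TearFree" "true"
            EndSection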

          • 2 years ago
            Anonymous

            Weird thing I've noticed about TearFree is that just enabling it can sometimes make compositors work correctly (i.e. no longer be capped to the lowest refresh rate). I wonder what the mechanism behind that is.

            Anyway this is actually mostly irrelevant to me. My main motivation behind wanting to use separate X screens is VRR. There doesn't seem to be any way to make it work with Xinerama and Wayland is still like 5+ years away from being usable...

          • 2 years ago
            Anonymous

            Xinerama is definitely too old for this. RandR is more than a complete replacement, but some people still have Xinerama around in their code anyway. The best bet for fixing this is probably patching Xorg. I doubt it's somehow impossible, since there is already some sort of VRR support. Perhaps RandR needs to be extended or something like that, but it should be doable.

      • 2 years ago
        Anonymous

        X11 has many different ways to synchronize to the display. With the simple old ones it syncs to the primary display, which means a window on another monitor may run at the primary's (possibly higher) fps, but it still doesn't tear. With the newer sync functions you can sync to a specific framerate, which you get by just checking the refresh rate of the display the window is on. That also lets you handle frame overruns, making vsync smoother than the traditional sync options. You can also keep vsync always enabled but sleep to fit the vblank of the display the window is on. X11 makes all of that possible, and it's the method serious games use for more consistent frame times (even on Windows).
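
        The "keep vsync on but sleep to fit" part is just frame pacing with absolute deadlines. A minimal sketch of the idea; render()/swap_buffers() are stand-ins for your own loop, and refresh_hz comes from querying whatever display the window is on:

        /* sketch: absolute-deadline frame pacing (POSIX clock_nanosleep) */
        #include <time.h>

        static void pace_frame(struct timespec *next, double refresh_hz)
        {
            next->tv_nsec += (long)(1e9 / refresh_hz);
            while (next->tv_nsec >= 1000000000L) { /* normalize */
                next->tv_nsec -= 1000000000L;
                next->tv_sec++;
            }
            /* absolute sleep: one late frame doesn't push every later deadline back */
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, next, NULL);
        }

        /* usage:
         *   struct timespec next;
         *   clock_gettime(CLOCK_MONOTONIC, &next);
         *   for (;;) { render(); swap_buffers(); pace_frame(&next, 143.98); }
         */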

      • 2 years ago
        Anonymous

        Most applications use a very simple vsync method that syncs to the primary display but it's easy to fix in xorg server. I can fix it, but you need to wait until I buy a VRR display to test it on lol

        • 2 years ago
          Anonymous

          I have a VRR display but only one monitor kek. I attempted to connect my laptop to this monitor, and it wasn't able to detect that it was VRR capable. That might just be a limitation of the HDMI cable I was using though, since it wasn't even able to run it at 4K 60fps (only 30fps).

    • 2 years ago
      Anonymous

      XCB isn't needed. This behavior is usually controlled by GLX/EGL or Vulkan.

      >Mixed refresh rates actually work perfectly fine with randr and xcb.
      Do applications actually use the correct refresh rate on all displays? Do they not just always run at the refresh rate of the primary display?

      Applications that only cover one monitor will synchronize to the refresh rate of that monitor. The issue is compositors that create one window covering all monitors. If they created separate windows for every monitor I'm pretty sure mixed refresh rates would work, but none of them do.

      (At least this behavior applies to Mesa, who knows what NVIDIA is up to with their driver)
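
      To illustrate: the knob applications actually touch is the per-drawable swap interval. Sketch with GLX_EXT_swap_control, minus the extension-string check you'd do in real code:

      /* sketch: enable vsync for one GLX drawable via GLX_EXT_swap_control */
      #include <GL/glx.h>

      void enable_vsync(Display *dpy, GLXDrawable draw)
      {
          PFNGLXSWAPINTERVALEXTPROC swap_interval =
              (PFNGLXSWAPINTERVALEXTPROC)glXGetProcAddress(
                  (const GLubyte *)"glXSwapIntervalEXT");
          if (swap_interval)
              swap_interval(dpy, draw, 1); /* 1 = wait for vblank; Mesa syncs to the
                                              monitor covering the drawable */
      }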

  2. 2 years ago
    Anonymous

    I'm using two monitors, the main one at 144Hz and the second at 60Hz. Worked out of the box with xrandr by just setting the first to 144Hz. I might have to mess with vsync settings a bit to avoid screen tearing, but I haven't had any huge issues as of yet.

  3. 2 years ago
    Anonymous

    I did it for a while. Works well enough for me as I rarely move windows across monitors so I'm not losing much. Only reason I stopped is that awesome doesn't support it well. It kinda works if you run two WMs, but it's still annoying in some ways.

    >bad handling of mixed refresh rates
    Works on my machine.

    • 2 years ago
      Anonymous

      >but it's still annoying in some ways
      What are some of the issues that you ran into?

      • 2 years ago
        Anonymous

        For example, switching focus between monitors was impossible without an xdotool hack, and even then it was a bit buggy. Also, setting up different tags on both monitors was a pain, among other things I don't remember any more.
        Notably, awesome DID support separate X screens before, but it was dropped after version 4.1 or something. Maybe it could be patched back in, but I just went with xrandr instead.

        • 2 years ago
          Anonymous

          >switching focus between monitors was impossible without a xdotool hack and even then it was a bit buggy.
          I solved this by writing my own little C program with Xlib. It's working nicely so far.
          >Also setting up different tags on both monitors was a pain among other things I don't remember any more.
          This would probably be easy to patch into something like dwm, but I guess if you're using a more complicated WM that might be hard to do.

          • 2 years ago
            Anonymous

            >I solved this by writing my own little C program with Xlib. It's working nicely so far.
            not him, but can you share it anon senpai?

          • 2 years ago
            Anonymous

            https://pastebin.com/CDpTGF7i
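
            In case the paste dies: the core is only a few Xlib calls anyway. A sketch of the same idea (not necessarily what's in the paste; assumes focus-follows-mouse; build with cc jump.c -lX11):

            /* sketch: cycle the pointer to the next X screen's root */
            #include <X11/Xlib.h>

            int main(void)
            {
                Display *dpy = XOpenDisplay(NULL);
                int cur, n = ScreenCount(dpy);
                Window root, child;
                int rx, ry, wx, wy;
                unsigned int mask;

                /* XQueryPointer returns True on the screen the pointer is on */
                for (cur = 0; cur < n; cur++)
                    if (XQueryPointer(dpy, RootWindow(dpy, cur), &root, &child,
                                      &rx, &ry, &wx, &wy, &mask))
                        break;
                if (cur == n) cur = 0;

                int next = (cur + 1) % n;
                XWarpPointer(dpy, None, RootWindow(dpy, next), 0, 0, 0, 0,
                             DisplayWidth(dpy, next) / 2, DisplayHeight(dpy, next) / 2);
                XCloseDisplay(dpy); /* also flushes the warp request */
                return 0;
            }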

          • 2 years ago
            Anonymous

            thanks tbh, ill try it out later

  4. 2 years ago
    Anonymous

    I stumbled upon this way of doing multi-monitor when I decided to put an older video card to use in my setup. Both monitors have the same refresh rate, so that hasn't been an issue.
    It kind of feels like having two computers running, if that makes sense. You can't move windows between the two monitors, and this even goes so far as not being able to open a single program's windows on both monitors at once. I found myself using two web browsers so I could have internet information on both monitors (see the display-string example below).
    But this also lets each monitor swap between workspaces independently of the other, which I think will be useful for reference material.
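
    If anyone wants to replicate it: which screen a program opens on is just the screen number in the display string, e.g.:

    DISPLAY=:0.1 firefox &   # opens on the second X screen
    DISPLAY=:0.0 firefox &   # opens on the first (usually the default)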

  5. 2 years ago
    bruce3434

    >Xorg
    This isn't 2003 anymore

  6. 2 years ago
    Anonymous

    tbh nvidia just werks, i dont have this problem on my 1080

    • 2 years ago
      Anonymous

      Do you know if variable refresh rate works on multi-monitor setups on nvidia?

      • 2 years ago
        Anonymous

        yea it does

        • 2 years ago
          Anonymous

          Fricking hell. Mesa/xorg bros pls fix

      • 2 years ago
        Anonymous

        It doesn't.

        tbh nvidia just werks, i dont have this problem on my 1080

        yea it does

        Why the frick are you lying?

  7. 2 years ago
    Anonymous

    I used to, mostly because I had four screens across two separate cards and couldn't make it work any other way. I strongly suspect that the reason I couldn't make it work any other way boiled down to "Nvidia are shitheads", multi-card works fine now with amdgpu, nouveau, and intel. VRR wasn't a thing back then so I never tried that.

    Never, ever, under any circumstance, get Nvidia anything.

    • 2 years ago
      Anonymous

      >multi-card works fine now with amdgpu, nouveau, and intel
      No it doesn't. Xinerama completely disables 3D acceleration and copies everything to each card.

      • 2 years ago
        Anonymous

        >Xinerama
        found your problem

        • 2 years ago
          Anonymous

          That's the only way to get multi-card on Xorg, unless you use Nvidia's Mosaic shit which only works on Quadros.

          • 2 years ago
            Anonymous

            no it's not. I'm using two nvidia GPUs with a different monitor on each X screen/GPU, no Xinerama involved (i.e. can't drag windows between screens)

            https://i.imgur.com/6OhOpEf.png

            Anyone here unironically using Xorg with separate X screens?
            It allows you to avoid the limitations of Xinerama such as bad handling of mixed refresh rates and VRR not working with multiple displays active.
            I'm curious how many people are using such setup.

            I'm using it but it seems to break gsync for me even without Xinerama, also there is a Qt6 bug where apps always start on the wrong monitor. Apparently multi-screen X is something so few people use that they don't actually test it.

          • 2 years ago
            Anonymous

            I'm talking about without using multiple X screens. Read the post I'm replying to

            I used to, mostly because I had four screens across two separate cards and couldn't make it work any other way. I strongly suspect that the reason I couldn't make it work any other way boiled down to "Nvidia are shitheads", multi-card works fine now with amdgpu, nouveau, and intel. VRR wasn't a thing back then so I never tried that.

            Never, ever, under any circumstance, get Nvidia anything.

          • 2 years ago
            Anonymous

            Randr has supported that for like 10 years.

          • 2 years ago
            Anonymous

            Randr is a protocol, none of those drivers support a screen spread across multiple GPUs without Xinerama.

          • 2 years ago
            Anonymous

            It's implemented in xorg homie. Don't get smart with me. Your use case has been supported since 1.4.
            https://cgit.freedesktop.org/xorg/proto/randrproto/tree/randrproto.txt?id=randrproto-1.4.0#n144

          • 2 years ago
            Anonymous

            Linked the wrong line technically but whatever
            https://cgit.freedesktop.org/xorg/proto/randrproto/tree/randrproto.txt?id=randrproto-1.4.0#n150

          • 2 years ago
            Anonymous

            Linked the wrong line technically but whatever
            https://cgit.freedesktop.org/xorg/proto/randrproto/tree/randrproto.txt?id=randrproto-1.4.0#n150

            It's not my use case, I'm just correcting misinformation.

            >It's implemented in xorg
            Link to the code then, or you're just making shit up.

          • 2 years ago
            Anonymous

            >implying xorg doesn't implement randr
            uh anon what? This is the commit where they bumped the version (we're actually on 1.5 now). There's stuff before that where you can see them implementing this.
            https://github.com/freedesktop/xorg-xserver/commit/c1602d1c17967bdd4db9db19b3a9c0dfca6a58aa
            https://github.com/freedesktop/xorg-xserver/commits/master/randr

          • 2 years ago
            Anonymous

            Post the commit where "multiple GPU rendering" is implemented then.

          • 2 years ago
            Anonymous

            It's multiple commits, but it looks like it should be this series of changes that did it.
            https://github.com/freedesktop/xorg-xserver/compare/3cbc4c10b52896324fe14d2ab56bd54577c9294c...c41922940adbc8891575b3321fadf01ff4cb5854

          • 2 years ago
            Anonymous

            Those are all for PRIME support, i.e. two screens where one driver renders (source) and the other displays (sink). It's not multi-GPU rendering nor multiple GPUs on one screen.

          • 2 years ago
            Anonymous

            >It's not multi-GPU rendering nor multiple GPUs on one screen
            That's also PRIME, or PRIME render offload specifically.
            See:
            >We want to render applications on the more powerful card and send the result to the card which has display connected.

      • 2 years ago
        Anonymous

        >Xinerama completely disables 3D acceleration and copies everything to each card.
        1. That depends on implementation. It worked fine with the Nvidia driver, for example.
        2. No one uses Xinerama any more.
        3. Copying buffers doesn't mean disabled 3D acceleration.

  8. 2 years ago
    Anonymous

    i used to run a separate x server to run games in, 15 years ago

  9. 2 years ago
    Anonymous

    I do. I use my laptop in a docked configuration. The external monitor is the primary screen. The laptop's display runs a second X screen with an instance of suckless' tabbed and a small program I wrote to automatically reparent windows to that instance of tabbed. It also monitors how many tabs there are, and if there are none, it turns the monitor off (it also turns it back on should a new window get mapped). Additionally it handles a few keybindings.
    I mostly use it for a terminal when I want to start something that takes a long time but which I want to monitor, like a download or an update. I also use it if I want to keep some documentation open, or watch a video in the background.
    The downside I have noticed is that some programs using GLX just crash on the side monitor (screen 1) while they work on the main one (screen 0). I think it's related to using them both with the same GPU in a ZaphodHeads configuration.
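
    The reparenting part is less magic than it sounds. A rough sketch, not my actual program (assumes you already have tabbed's window id; a real version should also skip override-redirect windows):

    /* sketch: pull newly mapped windows on screen 1 into tabbed */
    #include <X11/Xlib.h>

    void embed_loop(Display *dpy, Window tabbed)
    {
        XSelectInput(dpy, RootWindow(dpy, 1), SubstructureNotifyMask);
        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == MapNotify && ev.xmap.window != tabbed)
                XReparentWindow(dpy, ev.xmap.window, tabbed, 0, 0);
        }
        /* turning the panel off once the tab count hits zero would use
         * DPMSForceLevel(dpy, DPMSModeOff) from <X11/extensions/dpms.h> */
    }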

    • 2 years ago
      Anonymous

      >The downside I have noticed is that some programs using GLX just crash on the side monitor (screen 1) while they work on the main one (screen 0). I think it's related to using them both with the same GPU in a ZaphodHeads configuration.
      This shit is not tested at all any more, so bugs like that wouldn't surprise me.

      Randr is a protocol, none of those drivers support a screen spread across multiple GPUs without Xinerama.

      Yes, they do. It's part of RandR 1.4.
      >--setprovideroutputsource provider source
      > Set source as the source of display output images for provider. This is only possible if source and provider have the Source Output and Sink Output capabilities, respectively. If source is
      > 0x0, then provider is disconnected from its current output source.
      > --setprovideroffloadsink provider sink
      > Set provider as a render offload device for sink. This is only possible if provider and sink have the Source Offload and Sink Offload capabilities, respectively. If sink is 0x0, then provider
      > is disconnected from its current render offload sink.
      Even the Nvidia driver lets you do this through xrandr nowadays.

      Post the commit where "multiple GPU rendering" is implemented then.

      It's part of PRIME/reverse PRIME and there's also render offloading.
      See: https://wiki.archlinux.org/title/PRIME#Reverse_PRIME
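
      Typical invocation, with provider indices from xrandr --listproviders:

      xrandr --listproviders                 # note the provider numbers
      xrandr --setprovideroutputsource 1 0   # provider 1 scans out images rendered by 0
      xrandr --setprovideroffloadsink 1 0    # provider 1 renders for sink 0 (pair with DRI_PRIME=1)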

      • 2 years ago
        Anonymous

        >Yes, they do. It's part of RandR 1.4.
        Nothing you quoted supports that. Render offload works by creating at least two screens, one being a GPU screen.

        • 2 years ago
          Anonymous

          You have PRIME (render offload) and reverse PRIME (using another GPU as dumb output pipe).
          What exactly are you missing?

          • 2 years ago
            Anonymous

            What started this moronic conversation in the first place -- a singular X screen for multiple GPUs and multi-GPU rendering. Randr offloading is neither of those: there's >=2 screens and the rendering work isn't distributed across them.

          • 2 years ago
            Anonymous

            Just attach multiple GPUs to the same screen

          • 2 years ago
            Anonymous

            And once again the only way of doing that is with Xinerama or Nvidia mosaic. Thanks for playing you lost.

          • 2 years ago
            Anonymous

            >You have a single X screen with as many outputs from multiple GPUs as you want
            false
            >plus you can offload rendering if you like.
            irrelevant
            >GPU screens are an Nvidia thing, plus they're just virtual render targets. They don't affect how anything is actually displayed.
            No it's how randr offloading works on ALL drivers. From one of the commits previously mentioned
            https://github.com/freedesktop/xorg-xserver/commit/9d179818293b466ec6f1777f0b792e1fbbeb318c
            What you want to personally call them is irrelevant. They're still X screens to the server with their own ScreenRec.
            >You can select which GPU to use for rendering at runtime with the DRI_PRIME variable.
            Which isn't multi-GPU rendering, dumbass it's only one GPU. If glxgears covers multiple monitors on different GPUs every pixel is still rendered by exactly the same single GPU.

            Just use multiple providers and put them on the same output. Isn't this what you want? Or do you think you can't do this for some reason?

          • 2 years ago
            Anonymous

            >Isn't this what you want?
            No read again.

            Here you can see how there is only one X screen in a dual GPU setup: https://wiki.gentoo.org/wiki/AMDGPU#Xrand_doesn.27t_see_HDMI_port_with_hybrid_system
            >iGPU: eDP
            >dGPU: HDMI-A-1-0
            Both outputs from two different GPUs attached to the same X screen.
            >Which isn't multi-GPU rendering, dumbass it's only one GPU. If glxgears covers multiple monitors on different GPUs every pixel is still rendered by exactly the same single GPU.
            What the frick do you expect? That glxgears renders on all GPUs at the same time like some frankenstein SLI? You can choose to use either the iGPU, or the dGPU. Or the weaker of two dGPUs, whatever.

            >Here you can see how there is only one X screen in a dual GPU setup
            What's this supposed to prove? Put an ErrorF in init_screen(). xrandr doesn't loop over every screen, it just shows the screen number of whatever was handed to it either as an optarg or the default screen from xlib.
            >Both outputs from two different GPUs attached to the same X screen.
            Yeah an output not a whole card. Try again.
            >What the frick do you expect? That glxgears renders on all GPUs at the same time like some frankenstein SLI? You can choose to use either the iGPU, or the dGPU. Or the weaker of two dGPUs, whatever.
            What do you think the definition of multi-GPU rendering is? What you're talking about is just glorified GPU switching.

          • 2 years ago
            Anonymous

            >What's this supposed to prove? Put an ErrorF in init_screen(). xrandr doesn't loop over every screen, it just shows the screen number of whatever was handed to it either as an optarg or the default screen from xlib.
            It proves you can use multiple GPUs on a single screen.
            >Yeah an output not a whole card. Try again.
            What does "whole card" mean?
            >What do you think the definition of multi-GPU rendering is? What you're talking about is just glorified GPU switching.
            Oh wow, way to move the goalposts.
            First you were like this
            >That's the only way to get multi-card on Xorg, unless you use Nvidia's Mosaic shit which only works on Quadros.
            >Randr is a protocol, none of those drivers support a screen spread across multiple GPUs without Xinerama.
            Which are both wrong.
            Now you also want heterogeneous multi-GPU rendering. Wow. But hey, you can do that by creating separate GL/Vulkan contexts for each GPU. It requires explicit handling by the program (unless you have a literal SLI setup), but you can do it. This has nothing to do with Xorg, though.

          • 2 years ago
            Anonymous

            >It proves you can use multiple GPUs on a single screen.
            This is called "fallacy of composition". An RROutput is an abstraction of a display output, not a GPU.
            >What does "whole card" mean?
            a graphics card
            >Which are both wrong.
            your denial is impressive
            >Oh wow, way to move the goalposts.
            This was always my goalpost...

            Post the commit where "multiple GPU rendering" is implemented then.

            >Now you also want heterogeneous multi GPU rendering. Wow. But hey, you can do that by creating multiple GL/Vulkan contexts for each GPU. It requires explicitly handling it by the program (unless you have a literal SLI setup), but you can do it. This has nothing to do with Xorg, though.
            You didn't answer my question. Let me try again: what definition of multi-GPU rendering are you using?

          • 2 years ago
            Anonymous

            >This is called "fallacy of composition". An RROutput is an abstraction of a display output, not a GPU.
            All this shit is abstractions to do specific things.
            >a graphics card
            You have the outputs and can use it for rendering. What is missing?
            >This was always my goalpost...
            You are this anon, right?:

            >multi-card works fine now with amdgpu, nouveau, and intel
            No it doesn't. Xinerama completely disables 3D acceleration and copies everything to each card.

            That's the only way to get multi-card on Xorg, unless you use Nvidia's Mosaic shit which only works on Quadros.

            Randr is a protocol, none of those drivers support a screen spread across multiple GPUs without Xinerama.

            This is the anon you were originally replying to who mentioned "multi-card":

            It doesn't.
            [...]
            [...]
            Why the frick are you lying?

            >multi-card works fine now with amdgpu, nouveau, and intel
            Which in this context obviously means PRIME/reverse PRIME, not what you are talking about. Which is possible, but not transparently unless you are using SLI. Vulkan explicitly supports it.

          • 2 years ago
            Anonymous

            >All this shit is abstractions to do specific things.
            things can be different
            >You have the outputs and can use it for rendering. What is missing?
            rendering distributed across multiple gpus with only one X screen
            >Which in this context obviously means PRIME/reverse PRIME, not what you are talking about.
            Fine, sorry. It wasn't obvious to me since that still needs a screen per GPU to work. But I see now that by screen you actually mean non-GPU screen.

          • 2 years ago
            Anonymous

            >a singular X screen for multiple GPUs and multi-GPU rendering.
            That's what these xrandr options do. You have a single X screen with as many outputs from multiple GPUs as you want, plus you can offload rendering if you like.
            >Randr offloading is neither of those: there's >=2 screens
            GPU screens are an Nvidia thing, plus they're just virtual render targets. They don't affect how anything is actually displayed.
            >and the rendering work isn't distributed across them.
            You can select which GPU to use for rendering at runtime with the DRI_PRIME variable.
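
            i.e.:

            DRI_PRIME=1 glxinfo | grep "OpenGL renderer"   # should report the second GPU
            DRI_PRIME=1 some_game                          # hypothetical app, rendered on GPU 1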

          • 2 years ago
            Anonymous

            >You have a single X screen with as many outputs from multiple GPUs as you want
            false
            >plus you can offload rendering if you like.
            irrelevant
            >GPU screens are an Nvidia thing, plus they're just virtual render targets. They don't affect how anything is actually displayed.
            No it's how randr offloading works on ALL drivers. From one of the commits previously mentioned
            https://github.com/freedesktop/xorg-xserver/commit/9d179818293b466ec6f1777f0b792e1fbbeb318c
            What you want to personally call them is irrelevant. They're still X screens to the server with their own ScreenRec.
            >You can select which GPU to use for rendering at runtime with the DRI_PRIME variable.
            Which isn't multi-GPU rendering, dumbass it's only one GPU. If glxgears covers multiple monitors on different GPUs every pixel is still rendered by exactly the same single GPU.

          • 2 years ago
            Anonymous

            Here you can see how there is only one X screen in a dual GPU setup: https://wiki.gentoo.org/wiki/AMDGPU#Xrand_doesn.27t_see_HDMI_port_with_hybrid_system
            >iGPU: eDP
            >dGPU: HDMI-A-1-0
            Both outputs from two different GPUs attached to the same X screen.
            >Which isn't multi-GPU rendering, dumbass it's only one GPU. If glxgears covers multiple monitors on different GPUs every pixel is still rendered by exactly the same single GPU.
            What the frick do you expect? That glxgears renders on all GPUs at the same time like some frankenstein SLI? You can choose to use either the iGPU, or the dGPU. Or the weaker of two dGPUs, whatever.

  10. 2 years ago
    Anonymous

    Imagine using loonix on desktop...

  11. 2 years ago
    Anonymous

    i3 doesn't support it

    • 2 years ago
      Anonymous

      Support what exactly? This is not really window manager dependent at all.

      • 2 years ago
        Anonymous

        Each instance can only run on one X screen

        • 2 years ago
          Anonymous

          And that should work with i3 if you really want to do that instead of just using xrandr. It's not window manager dependent.

          • 2 years ago
            Anonymous

            he's saying he'd need to run two i3 instances, meaning he can't move windows between them or share the workspaces
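
            i.e. an .xinitrc like this (sketch); each instance then has its own independent workspaces:

            DISPLAY=:0.1 i3 &   # second instance manages screen 1
            exec i3             # first instance manages screen 0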

  12. 2 years ago
    Anonymous

    How do i set it up?
