Has anyone gotten a 5K, 6K, or 8K display to work in dom0?

I have a 6K monitor and so far have only been able to get it to run at 4K in Qubes dom0.

The GPU is an Intel Arc A310 Eco (uses i915), which, despite being a single-slot, half-width card, has 8K capability via its mini-DP or HDMI ports. The monitor is a Dell U3224KB (6K).

I’m not going to get deep into the specific issues I’ve encountered thus far, as it isn’t my first rodeo and there’s plenty more troubleshooting to do before dragging someone else into it.

So consider this a survey: has anyone gotten 5K, 6K, or 8K to work in dom0? With an open or proprietary driver? AMD, Nvidia, or Intel? Any tips/pitfalls? … I don’t care about acceleration, I just want the pixels. I have a 7965WX that should handle the software rendering without too much of a problem.

Because the monitor cost as much as an F-35 fighter jet with chrome rims, I’m willing to buy any GPU proven to work in such a configuration. Though I fear I may literally be the only person in the world using this monitor with Qubes.

For those insisting on knowing what I’ve tried:

  1. Tried setting the Xorg intel/i915 and modesetting drivers in an Xorg config file section. The highest resolution either picked up was 4K.
  2. Tried forcing a modeline specific to the resolution, both in the Xorg config and via xrandr/arandr (vague error about the CRTC failing); a rough sketch of the xrandr approach follows this list.
  3. Was going to try forcing Xorg to use a 6K EDID file, but I haven’t been able to find or generate one (found a script that generates the blobs, but it complained about the aspect ratio).
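
For anyone curious what the xrandr attempt in item 2 looked like, here is a rough sketch of the usual approach. The output name DP-1 is an assumption (check xrandr for the real connector name), and the target resolution is just the monitor’s native 6144x3456:

```
# Generate a CVT reduced-blanking modeline and feed it to xrandr.
modeline=$(cvt -r 6144 3456 60 | grep -i '^modeline' | tr -d '"' | cut -d' ' -f2-)
set -- $modeline                   # $1 = mode name, remaining words = timings
xrandr --newmode "$@"              # register the custom mode with the X server
xrandr --addmode DP-1 "$1"         # attach it to the output
xrandr --output DP-1 --mode "$1"   # try to switch to it (this is where the CRTC error showed up)
```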

What I have not tried:

  1. The obvious: see if it works in Ubuntu, with both vanilla and out-of-tree i915 drivers
  2. See if it works with Windows
  3. See if I can get a Good Enough solution by rigging up Xinerama or ZaphodHeads, creating two logical screens on the single physical display
  4. Buying an AMD or Nvidia GPU…

My next post in some days/weeks will include details in case anyone has more experience with this and is willing to lend a hand…

Did you connect your GPU to the monitor using HDMI or mini-DP?

  • Max Resolution (HDMI) 4096 x 2160@60Hz
  • Max Resolution (DP) 7680 x 4320@60Hz

https://ark.intel.com/content/www/us/en/ark/products/227958/intel-arc-a310-graphics.html

1 Like

I have 3 displays daisy-chained using MST; my Screen0 in dom0 is 7680x1440 pixels.

I drive them using the onboard Intel iGPU (13th-gen UHD Graphics 770) over DisplayPort; it supports resolutions up to 7680x4320 @ 60Hz and 5120x3200 @ 120Hz.

It doesn’t work over HDMI, which only supports 4096x2160 @ 60Hz.
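
Side note for anyone comparing notes: a quick way to see what dom0’s X server thinks each connector can do is to ask xrandr (output names like DP-1/HDMI-1 vary by driver and port):

```
xrandr --listmonitors            # logical monitors and the combined Screen 0 size
xrandr | grep -A5 ' connected'   # each connected output and the first few modes X offers for it
```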

1 Like

Good question. And thank you for the response. I’m going to take a look at the link momentarily and hopefully return with a “That worked!”

Some additional info I should have included: the Intel Arc A310 Eco (it’s the one made by Sparkle) has two mini-DP ports and an HDMI port.

So, I’ve tried:

  • Single 8K and (to be safe) 16K HDMI cables from Cable Matters
  • A single 8K mini-DP cable made by Anker

Same results with both: 3840x2160@60Hz

However, I should mention that I have a lot of eggs in two remaining baskets that I plan to try:

First, the WRX90 board has two mini-DP IN ports and two 40Gbps USB-C OUT ports, intended to be used as GPU → mini-DP 1, GPU → mini-DP 2, then USB-C to the monitor. I’m hoping that will work as the WRX90 gods seemingly intended, be pleasing to the Dell gods, and, by extension, be pleasing to Xorg.

If that doesn’t work, for completeness, I plan to try connecting both a mini-DP and an HDMI cable from the A310 to the monitor. I’m doubtful about this one, but it’s worth a try.

It would be more natural to connect mini-DP to both a DP and a mini-DP port on the monitor simultaneously, but the monitor only has one mini-DP port, one HDMI port, and one USB-C port. That’s it.

I read the manuals for the A310 Eco, the Dell monitor, and the WRX90 board before buying the GPU and monitor, but I plan to do so once more today to see if I misread or misunderstood. From what I recall of the A310, either the HDMI or the mini-DP alone was capable of providing 8K. :confused:

As a side note: it’s extremely difficult to find brand-name mini-DP male ↔ mini-DP male cables, at least on Amazon and Newegg.

I haven’t tried the options above yet because I didn’t have the necessary cables; they should arrive this week :crossed_fingers: :prayer_beads:

This is good news. Am I right in assuming you used the standard in-tree/upstream i915 module included with Qubes?

And are you using the modesetting Xorg driver or the intel/i915 Xorg driver?

Anything special in your Xorg config?

Ah, thanks. I definitely misremembered the Sparkle spec sheet, which I swear said 8K with DP or HDMI, but as your link says (and the Sparkle spec sheet says), HDMI is indeed only 4K.

So maybe there is hope with mini-DP: A310 mDP1 → WRX90 mDP1, A310 mDP2 → WRX90 mDP2, with the WRX90’s USB-C out to the monitor.

I’m ignorant of MST aside from knowing it exists. It sounds like an mDP → MST hub that outputs a single USB-C connection may be an option. Perhaps that’s functionally equivalent to what the WRX90 offers with its unconventional A310 mDP1 → WRX90 mDP1, A310 mDP2 → WRX90 mDP2, and 40Gbps USB-C out.

Thanks again for pointing this out

The only thing I did was generate the xorg.conf, but I don’t know if that is actually needed.

I didn’t change any driver settings, it just worked out of the box.
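
For context, “generate the xorg.conf” usually means something along these lines, run from a text console with the display manager stopped; this is a sketch, not Qubes-specific advice:

```
sudo systemctl stop lightdm      # stop the display manager (lightdm in a default dom0)
sudo Xorg :1 -configure          # writes a skeleton xorg.conf.new (usually in root's home dir)
sudo cp /root/xorg.conf.new /etc/X11/xorg.conf   # adjust the source path to wherever it landed
sudo systemctl start lightdm
```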

Reading the Intel link, I also see this curious tidbit, which seems to be a subtle contradiction to the Sparkle spec sheet:

Sparkle says:

PCI Express Configurations: PCI Express 4.0 x8

Intel says:

PCI Express Configurations: Up to PCI Express 4.0 x8 (x16 slot required)

I have the card in the only slot on the board that is limited to x8, which seems like it may be an issue preventing a single mDP → mDP connection from providing 6K. I’ll check that out.

And FFS, I feel awful that you read the F manual for me. I clearly didn’t do as good a job with that as I thought…

It means the card uses x8 lanes but the connector is x16; you can only fit the card into an open-ended x8 slot or a full x16 slot.

Derp, yeah, I realized they meant physical x16 after I swapped the card (which required me to remove two double-slot GPUs first, including one that sits very awkwardly yet precisely in place with a riser). Oh well, not a major problem.

Anyway, it seems this is definitely not a Qubes issue, though I did consider the possibility that an update to the current version of the open-source Intel GPU drivers in Qubes 4.2 might help, either i915 from Intel or the new Xe driver that is to supersede i915. But I don’t want to steal any more of anyone’s time just yet.

Progress

I got the GPU to report an 8K modeline in the Xorg log, and got the monitor to report 5120x2880 in the EDID data - better than before, where I was stuck at 3840x2160 for both. Progress!

I did this by focusing only on the DP connection, using the intel Xorg driver, and setting some i915 driver parameters that I had already read about over the past week, specifically:

i915.alpha_support=1
i915.force_probe=*
(using * because I’m too lazy to look up the device ID)
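
For anyone following along: in dom0 these parameters typically go on the kernel command line via GRUB. A minimal sketch, with paths that differ between BIOS and EFI installs:

```
# /etc/default/grub in dom0 -- append the i915 parameters to the existing line:
GRUB_CMDLINE_LINUX="... i915.alpha_support=1 i915.force_probe=*"

# then regenerate the GRUB config (EFI systems may use /boot/efi/EFI/qubes/grub.cfg instead):
sudo grub2-mkconfig -o /boot/grub2/grub.cfg
```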

I also grabbed updated GPU firmware/GUC/HUC blobs from the intel-gpu-firmware GitHub repo, though I’m not certain that mattered.

With those set, and a non-default DRI and HWAccel setting, Xorg now shows:

DP2 (disconnected)
DP3 (disconnected)
**DP2-8 (connected, max 5120x2880)**
HDMI1 (disconnected)
HDMI2 (disconnected)
**HDMI3 (connected, max resolution 3840x2160@60)**

It then says “Using spanning desktop for initial modes”, followed by a series of DRI and framebuffer errors after it sets both HDMI3 and DP2-8 to 3840x2160@60 each (a combined 7680x2160@60, which is too much for the monitor to handle).

From there it loads the “fb” module, which allocates a 7680x2160 framebuffer and spits out two WW (warning) lines: “cannot enable DRI2 whilst forcing software fallbacks” (good use of “whilst”!) and “Disabling XV because no adapters could be initialized”.

Finally, “failed to add fb” (also twice), an error “AddScreen/ScreenInit failed for driver 0”, and then a generic bombing out.
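
If anyone wants to dig through the same breadcrumbs, all of the above came from the Xorg log; the path can vary with the display manager, but something like this pulls out the relevant lines:

```
grep -E '\(WW\)|\(EE\)' /var/log/Xorg.0.log             # warnings and errors only
grep -iE 'edid|modeline|connected' /var/log/Xorg.0.log  # what X detected on each connector
```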

As all that happens, the monitor goes black for a few seconds and ultimately returns to the terminal. This is plenty for me to work with. I figure I should next look into the following, but will accept other suggestions:

  • Test whether a 40Gbps DSC mDP cable resolves the problem (I believe the current mDP cable may be 32Gbps, which should be plenty bandwidth-wise, but who knows in practice for any given GPU/monitor combination)
  • Add a modeline explicitly for 6K on DP2-8 and force it as the only/primary mode in my 20-intel.conf (rough sketch after this list)
  • Force DP2-8 and the HDMI output into one half of 6K each, then combine them into the single physical display via a logical Screen/ScreenLayout, maybe using some sort of inverse Xinerama configuration, if that’s a thing (planning to look into it; maybe it’s not possible…)
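
For the second item, the 20-intel.conf change I have in mind looks roughly like this. It’s only a sketch: the connector name DP2-8 is whatever Xorg reports on this box, and the modeline values are CVT reduced-blanking numbers that should be regenerated locally with cvt -r 6144 3456 60:

```
# /etc/X11/xorg.conf.d/20-intel.conf (sketch)
Section "Monitor"
    Identifier "DP2-8"                      # must match the output name Xorg reports
    # Regenerate these timings with: cvt -r 6144 3456 60
    Modeline "6144x3456R" 1344.50  6144 6192 6224 6304  3456 3459 3464 3555 +hsync -vsync
    Option "PreferredMode" "6144x3456R"
EndSection

Section "Device"
    Identifier "Intel-A310"
    Driver     "intel"
    Option     "Monitor-DP2-8" "DP2-8"      # bind the Monitor section above to that output
EndSection
```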

I’ll provide an update when this is resolved, in case it helps anyone else

Thanks once more for the note about the HDMI limitation and for the confirmation that high resolution (8K) is definitely not an impossibility for dom0. It lit a fire under me to keep trying, as painful as it is when the machine takes 2-3 minutes to POST and the first hour was all hard lockups and spontaneous reboots…

BTW, is there a way to have a moderator move this to “Hardware”, since User Support is probably not the right place for it?

Ok, this was pretty rough.

As expected, it had nothing to do with Qubes. It was mainly cables, plus a new-to-the-market and relatively uncommon GPU and monitor.

In summary:

  1. A 16K mini-DP → DP cable and a DP → mini-DP adapter were required, because only mini-DP ports were available on the A310 and the monitor. It’s surprising how hard it is to find a plain male mDP ↔ mDP 8K/16K cable. In theory, 8K should have sufficed
  2. The A310 GPU, being new, is not officially supported by i915, so I had to use i915.force_probe=56a6
  3. Without enable_guc=3, the system was locking up completely. With enable_guc=3, Xorg was refusing to initialize the adapter
  4. I’m not completely certain now, but I think the GUC/HUC files may have been out of date, so I grabbed the ones from kernel.org
  5. Xorg also didn’t want to initialize the A310 unless it had early (initramfs-time) KMS, which meant the GUC/HUC blobs, the i915 module, and a small handful of related modules (mei, mei-xp, etc.) had to be added to the initramfs (rough sketch of this after the list)
  6. Had to explicitly disable MST with i915.enable_mst=0
  7. Had to generate an EDID file for the obscure 6144x3456 resolution, because it wasn’t reliably being sent by the display. This also had to be added to the initramfs
  8. A custom modeline for xrandr
  9. Had to force the DisplayPort output with the video=DP-1:6144x3456 cmdline option
  10. 20-intel.conf for Xorg required VideoRam 256000 in the Device section and PrimaryGPU in the OutputClass section
  11. I needed to explicitly specify both the output name (DP-1) and the CRTC number when setting the custom xrandr mode, or it would fail. Maybe that’s a common thing; I wasn’t familiar with it
  12. Finally, the monitor had to be forced into DP1.4 mode using a long button press. I don’t know why it can’t figure this out on its own and I don’t care to find out at this point
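
To make the initramfs, cmdline, and 20-intel.conf items above a bit more concrete, here is roughly how that plumbing fits together on a Fedora-based dom0. Treat it as a sketch rather than copy-paste material: the file names, the DP-1 connector name, and the exact i915 parameter spellings (check modinfo i915) are all things to verify locally:

```
# /etc/dracut.conf.d/i915-early-kms.conf -- early KMS plus the custom EDID in the initramfs
add_drivers+=" i915 "                                  # plus the related mei modules as needed
install_items+=" /usr/lib/firmware/edid/6144x3456.bin "

# rebuild the dom0 initramfs afterwards
sudo dracut -f

# kernel command line additions (via /etc/default/grub, as sketched earlier):
#   i915.force_probe=56a6 i915.enable_guc=3
#   drm.edid_firmware=DP-1:edid/6144x3456.bin video=DP-1:6144x3456
```

And the Xorg side of it, matching item 10:

```
# /etc/X11/xorg.conf.d/20-intel.conf (relevant fragments)
Section "Device"
    Identifier "Intel-A310"
    Driver     "intel"
    VideoRam   256000
EndSection

Section "OutputClass"
    Identifier  "intel-primary"
    MatchDriver "i915"
    Driver      "intel"
    Option      "PrimaryGPU" "yes"
EndSection
```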

Complicating matters, the system takes 2-3 minutes to reboot, occasionally froze during incremental tests, and gave false results when manually stopping/starting lightdm between modifications of the Xorg configuration.

What a nightmare. But I wouldn’t have figured it out without the assurance here that it was possible, and the DP tip helped a lot.

Now I can enjoy millions more pixels than before! And it only takes up a single slot, with a half-width, half-length card that pulls just 50W (and cost only $100 USD).

Thanks!

1 Like