
HDR still doesn't really work on Linux w/ nVidia GPUs.

1. 10bpp color depth is not supported on RGB monitors, which are the majority of LCD displays on the market. Concretely, ARGB2101010 and XRGB2101010 modes are not supported by current nVidia Linux drivers - the drivers only offer ABGR2101010 and XBGR2101010 (See: https://github.com/NVIDIA/open-gpu-kernel-modules/blob/main/...).
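These mode names are DRM fourcc codes (from the kernel's drm_fourcc.h); the RGB- and BGR-ordered 10-bit variants differ only in channel order. A small Python sketch of how the codes are packed, for reference:

```python
def drm_fourcc(a, b, c, d):
    """Pack four characters into a DRM fourcc code (little-endian)."""
    return ord(a) | ord(b) << 8 | ord(c) << 16 | ord(d) << 24

# 10-bit formats from drm_fourcc.h: 2 alpha/padding bits + 10 bits per channel.
# The nVidia driver currently exposes only the BGR-ordered pair.
formats = {
    "XRGB2101010": drm_fourcc("X", "R", "3", "0"),  # RGB order - missing
    "ARGB2101010": drm_fourcc("A", "R", "3", "0"),  # RGB order - missing
    "XBGR2101010": drm_fourcc("X", "B", "3", "0"),  # BGR order - offered
    "ABGR2101010": drm_fourcc("A", "B", "3", "0"),  # BGR order - offered
}
for name, code in formats.items():
    print(f"{name} = 0x{code:08x}")
```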

2. Common browsers like Chrome and Firefox have no real support for HDR video playback with nVidia Linux drivers. The "HDR" option appears on YouTube, but no HDR color can actually be displayed with an nVidia GPU.

Also, video backgrounds in Google Meet on Chrome are broken with nVidia GPUs and Wayland. Ironically it works on Firefox. This has been broken for a few years and no fix is in sight.

The "HDR" toggle you get on Plasma or Mutter is hiding a ton of problems behind the scenes. If you only have 8bpp, even if you can find an app that somehow displays HDR colors on nVidia/Wayland - you'll see artifacts on color gradients.
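The gradient artifacts are easy to quantify: HDR10 encodes luminance with the SMPTE ST 2084 PQ curve, and at 8 bits the luminance jump between adjacent code values around mid-gray is several nits, enough to show as visible banding. A rough Python check (constants taken from the ST 2084 definition):

```python
def pq_eotf(n):
    """SMPTE ST 2084 PQ EOTF: normalized code value (0..1) -> luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for bits in (8, 10):
    top = 2 ** bits - 1
    mid = top // 2
    # luminance difference between two adjacent code values near mid-gray
    step = pq_eotf((mid + 1) / top) - pq_eotf(mid / top)
    print(f"{bits}-bit: ~{step:.2f} nit jump between adjacent codes near mid-gray")
```

The 8-bit step is roughly four times coarser than the 10-bit one, which is the banding you see on gradients.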



I have Interstellar on 4K UltraHD Blu-ray that features HDR on the cover, Sony 4K Blu-ray player (UBP-X700) and a LG G4 OLED television. I also have an AVR (Denon AVR-S760H 7.2 Ch) connecting both the Blu-ray and a PC running Linux with a RTX 3060 12GB graphic card to the television. I've been meaning to compare HDR on Linux with the Blu-ray. I guess now better than never. I'll reply back to my post after I am done.


Try it with the different monitors you have. The current nVidia Linux drivers only offer BGR output for 10bpp, which works on TVs and OLEDs but not on most LCD monitors.

My monitors (InnoCN 27M2V and Cooler Master GP27U) require RGB input, which means they're limited to 8bpp even with HDR enabled on Wayland. There's another commenter below who uses a Dell monitor and manages to get BGR input working for full HDR on nVidia/Linux.


I connected two portable LCDs I have that support HDR. Both LCDs didn't automatically detect HDR and looked washed out initially. I had to manually change them to HDR. The signal according to the AVR was...

  Resolution: 4K60
  HDR: HDR10
  Color Space: YCbCr 4:4:4/BT.2020
  Pixel Depth: 12bits
  FRL Rate: ---

...for both the LCDs.

Here are their specs:

  https://www.amazon.com/dp/B09Q5L245X
  https://www.amazon.com/dp/B08131MVGT

With HDR off for both the desktop and LCD, the YouTube HDR video at 19s seems flat. I could increase the monitor's brightness to match the planet's brightness when HDR is on, but space would be washed out. Of course, without HDR, lowering the brightness for darker space results in the planet becoming darker too.

When HDR is off for the LCD and desktop, I do still see a difference between YouTube's HDR and SDR videos. For example, at the 19s mark I cannot see most of the debris scattering between the viewer and the planet in the SDR video. That should be the case for you too.

*Edit: Strange... one of the monitors states 10bit colors in the link even though the AVR claimed a signal of 12bits. Not sure what to make of that!


I'll look into that tomorrow. See my other comment for Linux vs Blu-ray.


Television HDR mode is set to FILMMAKER, OLED brightness 100%, Energy Saving Mode is off. Connected to the AVR with an HDMI cable that says 8K.

  PC: Manjaro Linux with RTX 3060 12GB
  Graphics card driver: Nvidia 580.119.02
  KDE Plasma Version: 6.5.4
  KDE Frameworks Version: 6.21.0
  Qt Version: 6.10.1
  Kernel Version: 6.12.63-1-MANJARO
  Graphics Platform: Wayland

Display Configuration:

  High Dynamic Range: Enable HDR is checked
  There is a button for brightness calibration that I used for adjustment.
  Color accuracy: Prefer color accuracy
  sRGB color intensity: This seems to do nothing (even after Apply). I've set it to 0%.
  Brightness: 100%

TV is reporting an HDR signal.

AVR is reporting...

  Resolution: 4K VRR
  HDR: HDR10
  Color Space: RGB/BT.2020
  Pixel Depth: 10bits
  FRL Rate: 24Gbps

I compared Interstellar in three different ways on Linux (19s into the YouTube video) and on the Blu-ray (2:07:26).

For Firefox 146.0.1, by default there is no HDR option on YouTube, and the 4K video clearly doesn't have HDR. I enabled HDR in Firefox by going to about:config and setting the following to true: gfx.wayland.hdr, gfx.wayland.hdr.force-enabled, gfx.webrender.compositor.force-enabled. Colors then looked completely washed out.

For Chromium 143.0.7499.169, HDR is enabled by default. This looks like HDR.

I downloaded the HDR video from YouTube and played it using MPV v0.40.0-dirty with --vo=gpu-next --gpu-api=vulkan --gpu-context=waylandvk. Without these settings the video seems a little too bright, like the Chromium playback. This was the best playback of the three on Linux.

On the Blu-ray the HDR is Dolby Vision according to both the TV and the AVR. The AVR is reporting...

  Resolution: 4K24
  HDR: Dolby Vision
  Color Space: RGB
  Pixel Depth: 8bits
  FRL Rate: no info

...I looked into this, and apparently Dolby Vision uses RGB tunneling to carry its high-bit-depth (12-bit) YCbCr 4:2:2 data. The Blu-ray looks like it has the same brightness range, but the color of the explosion (2:07:26) seems richer compared to the best playback on Linux (19s).
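The bits-per-pixel arithmetic explains why the AVR can report an 8-bit RGB signal: 12-bit YCbCr 4:2:2 averages the same 24 bits per pixel as 8-bit RGB, so the payload fits the tunnel exactly. A quick check:

```python
# Bits per pixel for each signal; 4:2:2 shares each Cb/Cr sample across two pixels.
rgb_8bit = 3 * 8                      # 24 bits/pixel in the RGB tunnel the AVR sees
ycbcr_422_12bit = 12 + (12 + 12) / 2  # Y per pixel + shared chroma = 24 bits/pixel
print(rgb_8bit, ycbcr_422_12bit)      # identical, so the payload fits the container
```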

I would say the colors overall look better on the Blu-ray.

I might be able to calibrate it better if the sRGB color setting worked in the display configuration. Also I think my brightness setting is too high compared to the Blu-ray. I'll play around with it more once the sRGB color setting is fixed.

*Edit: Sorry Hacker News has completely changed the format of my text.


Thank you, this is very valuable.


I don't think the Interstellar Blu-ray has Dolby Vision (or Dolby Atmos), just regular HDR10. If the TV/AVR says it's Dolby Vision something in your setup might be doing some kind of upconversion.


You're right! It looks like the Sony UBP-X700 doesn't automatically detect the HDR type and was set to Dolby Vision. I turned it off and the TV now displays the same HDR logo it shows when connecting to the PC. The AVR says...

  Resolution: 4K24
  HDR: HDR10
  Color Space: YCbCr 4:4:4/BT.2020
  Pixel Depth: 12bits
  FRL Rate: ---

...colors are now more aligned with the PC. The Blu-ray video seems to be showing more detail in the explosion. I thought this extra detail was because of more color being shown, but I now think it might have something to do with YouTube's HDR video being more compressed.


I find that running HDR games in a standalone Steam gamescope session works great on my OLED TV. Not perfect, but great.


I don't think this is true. I can go into my display settings in KDE Plasma and enable HDR and configure the brightness. I have an Nvidia Blackwell card.


You can enable, yes. But (assuming you're on an LCD display and not an OLED), you're likely still on XRGB8888 - i.e. 8-bit per channel. Check `drm_info`.

Also, go to YouTube and play this video: https://www.youtube.com/watch?v=onVhbeY7nLM

Do it once on "HDR" on Linux, and then on Windows. The "HDR" in nVidia/Linux is fake.

The brightness you see on Plasma or Mutter is indeed related to the HDR support in the driver. But - it's not really useful for the most common HDR tasks at the moment.


I asked Claude to investigate:

  Your Display Configuration

  Both monitors are outputting 10-bit color using the ABGR2101010 pixel format.

  | Monitor                | Connector | Format      | Color Depth | HDR          | Colorspace |
  |------------------------|-----------|-------------|-------------|--------------|------------|
  | Dell U2725QE (XXXXXXX) | HDMI-A-1  | ABGR2101010 | 10-bit      | Enabled (PQ) | BT2020_RGB |
  | Dell U2725QE (XXXXXXX) | HDMI-A-2  | ABGR2101010 | 10-bit      | Disabled     | Default    |

* Changed the serial numbers to XXXXXXX

I am on Wayland and outputting via HDMI 2.1 if that helps.

EDIT: Claude explained how it determined this with drm_info, and manually verified it:

> Planes 0 and 3 are the primary planes (type=1) for CRTCs 62 and 81 respectively - these are what actually display your desktop content. The Format: field shows the pixel format of the currently attached framebuffer.

EDIT: Also note that I am slowbanned on this site, so may not be able to respond for a bit.

EDIT: You should try connecting with HDMI 2.1 (you will need an 8K HDMI cable or it will fall back to older standards instead of FRL).
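The 8K-cable point is about bandwidth: older TMDS signaling tops out at 18 Gbps, which is just enough for 4K60 at 8 bits per channel but not at 10, so a 10-bit HDR mode needs FRL (HDMI 2.1). Back-of-envelope, assuming the standard 594 MHz 4K60 pixel clock and 8b/10b TMDS coding overhead:

```python
pixel_clock_hz = 594e6  # CTA-861 4K60 timing, including blanking
TMDS_LIMIT = 18e9       # max aggregate TMDS bandwidth (HDMI 2.0)

for bpc in (8, 10):
    # 3 channels; 8b/10b coding inflates every 8 payload bits to 10 line bits
    rate = pixel_clock_hz * 3 * bpc * 10 / 8
    verdict = "fits TMDS" if rate <= TMDS_LIMIT else "needs FRL"
    print(f"{bpc} bpc: {rate / 1e9:.2f} Gbps -> {verdict}")
```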

EDIT: HDR on YouTube appears to work for me. YouTube correctly identifies HDR on only one of my monitors, and I can see a big difference in the flames between them in this scene: https://www.youtube.com/watch?v=WjJWvAhNq34


I don't have a Dell U2725QE, but on InnoCN 27M2V and Cooler Master GP27U there's no ABGR2101010 support. These monitors would only work with ARGB2101010 or XRGB2101010 which nVidia drivers do not provide.

Here's what I'm getting on both monitors, with HDR enabled on Gnome 49: https://imgur.com/a/SCyyZWt

Maybe you're lucky with the Dell. But as I understand, HDR playback on Chrome is still broken.


HDR playback in Chrome on KDE works as expected from what I can tell. On GNOME 49.2 it does not; it doesn't reach the luminance it should at this time. 49.3 may fix this.


Ok. I'm using DisplayPort 1.4a with my 4090 at the moment. Maybe I'll try HDMI 2.1 and see what happens.

I'm actually surprised that YouTube HDR works on your side - perhaps it's tied to the ABGR2101010 output mode being available.


No luck for me with HDMI 2.1 - still seeing XRGB8888 on my monitors after HDR enabled.

That's still pretty crappy. Monitor spec sheets don't say whether they accept BGR input signals as opposed to RGB.


Was it an 8k cable? Are you on wayland?


I'm on Wayland and the cable is HDMI 2.1 Ultra High Speed, which means 8K. Xorg is already gone on Ubuntu 25.10.

The GPU and monitor combination has full 10-bit HDR on Windows. But on Linux it's stuck at 8bpp because the nVidia driver offers no 10-bit RGB output.


I don’t think your problem is RGB instead of BGR. That’s just the compositor’s work area and your monitor never sees it (it includes an alpha channel). Have you tried KDE Plasma? It sounds like KWin uses 10-bit planes by default when available. Maybe Ubuntu’s compositor (Mutter?) doesn’t support 30 bit color or must be configured? Or maybe you need the nvidia driver >= 580.94.11 for VK_EXT_hdr_metadata (https://www.phoronix.com/news/NVIDIA-580.94.11-Linux-Driver)


It's not obvious how to interpret the output. I pasted it into ChatGPT and it thinks I am using "Format: ABGR2101010" for both monitors (only one has HDR on), so I don't trust it.

EDIT: See my sibling comment.


Under the Planes section, look for planes that have non-zero "CRTC_ID". Those are the planes that actually get output to your monitor.
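If you'd rather not eyeball the text output, drm_info also has a JSON mode (`drm_info -j`). A rough Python filter for the method above, assuming a schema where each device lists `planes` whose `properties` map includes `CRTC_ID` (verify the field names against your drm_info version's actual output):

```python
def active_planes(info):
    """Return (plane id, crtc id, format) for planes bound to a CRTC."""
    result = []
    for device in info.values():
        for plane in device.get("planes", []):
            props = plane.get("properties", {})
            crtc = props.get("CRTC_ID", {}).get("value", 0)
            if crtc:  # non-zero CRTC_ID: this plane is actually scanned out
                result.append((plane.get("id"), crtc, plane.get("format")))
    return result

# Hypothetical sample mirroring the assumed schema:
sample = {
    "/dev/dri/card0": {
        "planes": [
            {"id": 40, "format": "XRGB8888",
             "properties": {"CRTC_ID": {"value": 62}}},
            {"id": 41, "format": "ABGR2101010",
             "properties": {"CRTC_ID": {"value": 0}}},  # idle plane, no CRTC
        ]
    }
}
print(active_planes(sample))  # only plane 40 is live

# Against real hardware (requires drm_info installed):
# import json, subprocess
# info = json.loads(subprocess.run(["drm_info", "-j"],
#                                  capture_output=True, text=True).stdout)
# print(active_planes(info))
```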

Here's what I'm getting on an RTX 4090 / InnoCN 27M2V and Cooler Master Tempest GP27U.

https://imgur.com/a/SCyyZWt


nvidia


Right, it IS Nvidia's fault at this point, but it's still, what, 90% of the consumer GPU market?


Funny how it went from "just get an Nvidia card for Linux" and "oh my god, what did I do to deserve fglrx?" to "just get an AMD card" and "it's Nvidia, what did you expect?"


They're also selling $3000 nVidia AI workstations that exclusively use Linux. But what if you want to watch an HDR video on one? No. What if you want to use Google Meet on Chrome/Wayland? It's broken.


For aftermarket purchase sure, but 95% of consumer machines are using either Intel or AMD integrated graphics.


The way it's not meant to be played.



