
Not the GP, but my current dream monitor would be a 3:2 or 16:10 OLED in the 24"-27" range with roughly 200 PPI and 120 Hz, preferably slightly curved, with hardware calibration for at least the sRGB gamut. There’s nothing close to that on the market.


sRGB is a very modest goal.

I just got an RGB OLED laptop with a gamut significantly wider than Display P3. It's glorious. UHD content like 4K movies pops in a way you have to see in person. It's especially noticeable on military uniforms, where the various shades of dark green are much more distinct than on a typical monitor.


My priority is color accuracy via hardware calibration (an in-monitor LUT), so that no gradations are lost to OS-level or GPU-level mappings. I’d rather have an accurate sRGB display than a not-quite-accurate P3 (or, worse, "natural" wide-gamut) display. Also, to display sRGB images (still the large majority of what’s out there) accurately on a wide-gamut system, you need 10-bit color depth at the OS/GPU level to avoid losing or distorting gradations.
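
To make the gradation-loss point concrete, here’s a rough Python sketch (not a full color-management pipeline; the matrices are the standard sRGB and Display P3 D65 primaries, rounded to four decimals). It remaps an 8-bit sRGB red ramp into Display P3 and counts how many distinct red-channel steps survive at 8-bit versus 10-bit output depth:

    # Rough sketch: remap an sRGB red ramp into Display P3 and count
    # surviving gradations. Display P3 reuses the sRGB transfer curve,
    # so the same OETF/EOTF applies on both ends.
    import numpy as np

    def eotf(v):   # sRGB code value -> linear light
        return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

    def oetf(v):   # linear light -> code value
        return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

    SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]])
    XYZ_TO_P3 = np.array([[ 2.4935, -0.9314, -0.4027],
                          [-0.8295,  1.7627,  0.0236],
                          [ 0.0358, -0.0762,  0.9569]])
    M = XYZ_TO_P3 @ SRGB_TO_XYZ   # combined sRGB-linear -> P3-linear

    ramp = np.zeros((256, 3))
    ramp[:, 0] = np.arange(256) / 255.0            # all 256 sRGB red levels
    p3 = oetf(np.clip(eotf(ramp) @ M.T, 0.0, 1.0))

    for bits in (8, 10):
        red = np.round(p3[:, 0] * (2 ** bits - 1)).astype(int)
        print(f"{bits}-bit output: {len(np.unique(red))} of 256 red steps distinct")

Because sRGB red sits well inside the P3 gamut, the red channel compresses: at 8 bits some adjacent codes merge, while at 10 bits all 256 steps stay distinct.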


Most wide-gamut displays are 10 bits per channel, which makes them accurate enough even with software calibration.

Most also have 14-bit LUTs in hardware.


It's not sufficient for the display to be 10-bit: the OS and/or GPU (where the software calibration mapping takes place) must also work with 10 bits, and when graphics from different color spaces are combined on screen (UI elements, displayed images, etc.), the OS must correctly map each source color space to the 10-bit output color space. All of that working correctly is not yet commonplace.

Therefore, for dev work and dev-related UI graphics, I prefer to work in a calibrated "least common denominator" 8-bit sRGB space, because that's much easier to get right. However, to avoid losing color gradations to the calibration itself, hardware calibration is then preferable, as the sketch below illustrates.
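
A quick toy illustration of that last point (the correction curve is made up; a real one comes from a measuring instrument): apply a mild gamma correction on the GPU side of the 8-bit link, as software calibration does, versus inside a hypothetical 14-bit monitor LUT, and count surviving codes.

    # Toy example: software vs. hardware calibration of an 8-bit signal.
    # Correction curve: native gamma ~2.3 pulled toward 2.2 (hypothetical).
    import numpy as np

    codes = np.arange(256) / 255.0
    corrected = codes ** (2.2 / 2.3)

    # Software calibration: the GPU applies the curve, and the result is
    # squeezed back into the 8-bit link to the monitor.
    sw = np.round(corrected * 255).astype(int)
    # Hardware calibration: the 8-bit codes cross the link untouched and
    # the monitor applies the same curve in its internal 14-bit LUT.
    hw = np.round(corrected * 16383).astype(int)

    print(f"software LUT: {len(np.unique(sw))} of 256 codes stay distinct")
    print(f"hardware LUT: {len(np.unique(hw))} of 256 codes stay distinct")

The software path merges some adjacent codes (visible as banding in smooth gradients); the hardware path keeps all 256 distinct.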


It is commonplace-ish.

Windows has been able to use 16-bit float buffers for the desktop compositor since Vista. Some applications support this too, for all controls and UI elements. Desktop graphics applications such as Photoshop support 10-bit, and video playback is generally 10-bit as well.
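
For reference, a 16-bit float buffer has plenty of headroom for 10-bit content. This little check (pure numpy, no display involved) counts the representable float16 values in [0, 1]:

    # Count distinct float16 values between 0 and 1: far more than the
    # 1024 steps of a 10-bit channel, so an FP16 compositing buffer
    # doesn't itself crush gradations.
    import numpy as np

    all_bits = np.arange(2 ** 16).astype(np.uint16).view(np.float16)
    in_unit = np.unique(all_bits[(all_bits >= 0) & (all_bits <= 1)])
    print(len(in_unit))   # 15361 values in [0, 1], vs 1024 for 10-bit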

In the past, this feature was reserved for ludicrously expensive "professional" GPUs like the Quadro series, but it has since been enabled in software on mainstream AMD and NVIDIA GPUs. Very recently (just months ago?) my Intel GPU gained 10-bit output capability even in SDR mode.

It definitely works: I used "test pattern" videos and test images in Photoshop, and even dark grey-to-grey gradients are silky smooth on two of my monitors, including a 7-year-old Dell monitor!
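
If you want to try this yourself, here’s a small script (assumes numpy and Pillow; the file names and tonal range are my own choices) that writes dark grey-to-grey gradients quantized to 8-bit and 10-bit steps, stored as 16-bit grayscale PNGs so the extra precision survives the file format:

    # Generate dark grey-to-grey ramps for eyeballing banding. Whether
    # the 10-bit one looks smooth depends on the whole application/OS/
    # GPU/monitor chain actually passing 10 bits through.
    import numpy as np
    from PIL import Image

    W, H = 1024, 256
    lo, hi = 0.02, 0.10   # dark tonal range, where banding shows first
    ramp = np.linspace(lo, hi, W)

    for bits in (8, 10):
        steps = 2 ** bits - 1
        q = np.round(ramp * steps) / steps                 # quantize to bit depth
        img = np.tile((q * 65535).astype(np.uint16), (H, 1))
        Image.fromarray(img).save(f"grey_ramp_{bits}bit.png")  # uint16 -> 16-bit PNG
        print(f"grey_ramp_{bits}bit.png: {len(np.unique(q))} distinct levels")

View the results in an application that supports 10-bit output (e.g. Photoshop in 30-bit mode); on an 8-bit-only pipeline both files will band identically.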



