Joke’s on you, I use NVIDIA
*Cries*
Been watching this drama about HDR for a year now, and still can’t be arsed to read up on what it is.
HDR or High Dynamic Range is a way for images/videos/games to take advantage of the increased colour space, brightness and contrast of modern displays. That is, if your medium, your player device/software and your display are HDR capable.
HDR content is usually mastered with a peak brightness of 1000 nits or more in mind, while Standard Dynamic Range (SDR) content is mastered for 80-100 nit screens.
How is this a software problem? Why can’t the display server just tell the monitor "make this pixel as bright as you can (255) and this other pixel as dark as you can (0)"?
In short: because HDR needs additional metadata to work. You can watch HDR content on an SDR screen, but it looks horribly washed out, a bit like log footage. The HDR metadata tells the screen how bright/dark the image actually needs to be, and the software issue is correct support for said metadata.
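If you’re curious what that metadata looks like, you can poke at it yourself. A rough sketch using ffprobe (from ffmpeg); `movie.mkv` is just a placeholder, and the exact fields you get back vary by encoder and container:

```bash
# What the video stream claims about colour: for HDR10 you'd typically
# see color_transfer=smpte2084 and color_primaries=bt2020.
ffprobe -v error -select_streams v:0 \
  -show_entries stream=color_space,color_transfer,color_primaries \
  -of default=noprint_wrappers=1 movie.mkv

# The static HDR10 metadata (mastering display luminance, MaxCLL/MaxFALL)
# is attached as frame side data; decoding a single frame is enough to see it.
ffprobe -v error -select_streams v:0 -read_intervals "%+#1" \
  -show_entries frame=side_data_list -of json movie.mkv
```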
I’d speculate (I’m not an expert) that the reason for this is that it allows more granularity. Even the 1024 brightness steps that 10-bit colour can produce are nothing compared to the millions-to-one contrast of modern LCDs, or the near-infinite contrast of OLED. Besides, screens come in a range of peak brightnesses, and I suppose doing it this way lets each manufacturer interpret the metadata in whatever way looks best on their screens.
And also, with your solution a brightness value of 1023 would always mean the TV’s maximum brightness. You don’t always want that if your TV can literally flashbang you. Sure, you want the sun to hit peak brightness, but not every white object is as bright as the sun… That’s the true beauty of a good HDR experience: it looks fairly normal, but reflections of the sun or a fire in a dark room just hit differently when the rest of the scene stays much darker yet still clearly visible.
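To put numbers on that: HDR10 uses the PQ curve (SMPTE ST 2084), where a 10-bit code value maps to an absolute luminance rather than "whatever the panel maxes out at", and the display tone-maps whatever it can’t reach. A back-of-the-napkin sketch, illustration only, using the published constants:

```bash
#!/usr/bin/env bash
# Convert a 10-bit PQ code value to absolute luminance in nits.
# Try 519 (~100 nits, roughly SDR mastering white), 770 (~1000 nits), 1023 (10000 nits).
code=${1:-770}
awk -v c="$code" 'BEGIN {
  m1 = 2610/16384; m2 = 2523/4096*128
  c1 = 3424/4096;  c2 = 2413/4096*32; c3 = 2392/4096*32
  e = c / 1023                          # normalised signal, 0..1
  p = e^(1/m2)
  num = p - c1; if (num < 0) num = 0
  y = (num / (c2 - c3*p))^(1/m1)        # fraction of 10000 cd/m^2
  printf "code %4d  ->  about %.0f nits\n", c, y * 10000
}'
```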
HDR? Ah, you mean when videos keep flickering on Wayland!
I will switch when I need a new GPU.
Now that explicit sync has been merged this will be a thing of the past
Videos? Everything flickers for me on Wayland. X.org is literally the only thing keeping me from switching back to Windows right now.
OK but can you please call NVidiachan? I know you two don’t get along but maybe you can ask her for some support?
NVidiachan is busy selling GPUs for AI, but she is also working on adding explicit sync
If I understand correctly, Nvidia isn’t doing anything special for explicit sync; its driver just doesn’t support implicit sync, which is what Wayland currently uses because explicit sync isn’t there yet. Explicit sync would work with existing Nvidia drivers.
I’m not touching Wayland until it has feature parity with X and gets rid of all the weird bugs like cursor size randomly changing and my jelly windows being blurry as hell until they are done animating
Not sure why you’re getting downvoted. I wish I could switch, but only X works reliably.
have you tried plasma 6?
You want to win me over? For starters, provide a layer that supports all the hooks and features of `xdotool` and `wmctrl`. As I understand it, that’s nowhere near present, and maybe even deliberately impossible “for security reasons”. I know about `ydotool` and `dotool`. They’re something, but definitely not drop-in replacements.

Unfortunately, I suspect I’ll end up being forced onto Wayland at some point because the easy-use distros will switch to it, and I’ll just have to get used to moving and resizing my windows manually with the mouse. Over and over. Because that’s secure.
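For anyone who hasn’t used those tools, this is roughly the kind of scripting being talked about. A sketch only: it assumes an X11 session with xdotool and wmctrl installed, and the window title and geometry are made up:

```bash
#!/usr/bin/env bash
# Find a window by title, focus it, snap it to the left half of a
# 1920x1080 screen, and type into it, all without touching the mouse.
win_id=$(xdotool search --name "Firefox" | head -n 1)
xdotool windowactivate "$win_id"
wmctrl -i -r "$win_id" -e 0,0,0,960,1080
xdotool type --delay 50 "typed by a script"
```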
I think it’s possible to make such a tool for Wayland, but in Wayland, stuff like that is completely up to the compositor
So, ask the compositor developers to expose the required shit and you can make such a tool
Nah, I don’t need HDR
Network transparency OR BUST
Sure, let me dust off my fucking SPARCStation and connect up to my fucking NIS server so I can fuck off and login to my Solaris server and run X11
Fucking WHO needs mainframe-oriented network transparency in the 21st century? Leave that shit in 1989 where it belongs
OK, then buy an RTX 4090 for every computer in the house
I know for a fact that an Nvidia GT 730 under Nouveau and an Intel HD 2500 can run Wayland without issues
You misunderstand, I don’t want crap graphics on every computer, I want the 4090 driving every computer without having to buy one per computer.
That’s what you could do with network transparency.
RDP (Remote Desktop Protocol) works leaps, bounds and miles better than the 1989 X11 network transparency system ever did. Especially since X11 was never intended for hardware-accelerated compositing or 3D apps.
PCs were not intended to have more than 640 KB of RAM, and yet.
The blame for this decrepitude of X11 can be placed squarely on Nvidia, since X11’s functionality stands in contradiction to Nvidia’s unlimited profit ambitions.
RDP is the anachronism. Why would I want to stream a whole desktop environment, with its own separate taskbar, clock and user environment? Especially given how janky and laggy it is.
No, I want to stream just the application. It should use my system’s color and temperature scheme, interoperate with the clipboard and drag & drop, and be basically indistinguishable from a locally running app, except it’s streaming at 500 Mbps hardware-encoded AV1, 12 ms latency max, 16K resolution (yes, that is not a typo), 16-bit HDR, HDR that actually works, and the sound works too, every time, yes, 8-channel 192 kHz 24-bit lossless. Also capable of pure IP multicast streaming. Yes, that means one application instance visible on multiple computers at the same time, usable by multiple users at the same time, with no need for the app to be aware of any of this.
Do that with no jank and I’ll sing wayland’s praises.
There is a project called waypipe
Also I call bullshit on XOrg supporting anything you said without issues. In my experience, it can shit itself by itself when you look at it wrong.
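For what it’s worth, waypipe’s basic usage wraps an ssh call, something like the sketch below. It assumes waypipe is installed on both machines and that the remote app is a Wayland client; "user@remote" and "foot" are placeholders:

```bash
# Run a remote Wayland application and have it show up as a local window,
# roughly the Wayland counterpart of `ssh -X host app`.
waypipe ssh user@remote foot
# Compression/encoding options exist; check `man waypipe` for what your
# version actually supports.
```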
Joke’s on you I can’t afford an HDR display & also I’m colorblind.
You can still benefit from the increased brightness and contrast! Doesn’t make a good HDR screen any cheaper, though…
Was gonna say the same thing. HDR is like FLAC and expensive amps for audiophiles. Maybe we should start calling them visualphiles? 🤷♂️
“FLAC? Mate I destroyed my ears when I was 14 and listening to Linkin Park MP3s grabbed off Kazaa in the cheapest chinese earbuds my allowance could buy, at the highest volume my fake iPod could drive. I cannot hear the subtleties in your FLAC if I tried.”
Cheek aside, I believe the word would be “videophiles”, to pair with “audiophiles”.
HDR is almost useless to me. I’ll switch when wayland has proper remote desktop support (lmk if it does but I’m pretty sure it does not)
Seems like there’s a bunch of solutions out there:
As of 2020, there are several projects that use these methods to provide GUI access to remote computers. The compositor Weston provides an RDP backend. GNOME has a remote desktop server that supports VNC. WayVNC is a VNC server that works with compositors, like Sway, based on the wlroots library. Waypipe works with all Wayland compositors and offers almost-transparent application forwarding, like ssh -X.
Do these not work for your use case?
I did try those, but it might be the fault of my Nvidia card that they didn’t work. The issue was that I wasn’t able to understand or fix any of the problems that popped up. I’ll try again when I get a new GPU
Yeah, Nvidia really sucks on Linux unfortunately and they simply do not care very much.
Does wine run on wayland?
Edit: had to look up what the hell HDR is. Seems like a marketing gimmick.
Anti Commercial AI thingy
Inserted with a keystroke running this script on linux with X11
````bash
#!/usr/bin/env nix-shell
#!nix-shell -i bash --packages xautomation xclip
sleep 0.2
(echo '::: spoiler Anti Commercial AI thingy
[CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/)
Inserted with a keystroke running this script on linux with X11
```bash'
cat "$0"
echo '```
:::') | xclip -selection clipboard
xte "keydown Control_L" "key V" "keyup Control_L"
````
It isn’t, it’s just that marketing is really bad at explaining what HDR is actually about.
HDR means each color channel that used 8 bits can now use 10 bits, sometimes more. That’s an increase from 256 shades per channel to 1024, allowing a wider range of shades to be displayed in the same picture and avoiding the color banding you sometimes see in smooth gradients like skies.
Thank you.
I assume HDR has to be explicitly encoded into images (and moving images) to be true HDR, otherwise it’s just upsampled? If that’s the case, I’m also assuming most media out there is not encoded with HDR, and further, if that’s correct, does it really make a difference? I’m assuming upsampling means inferring new values, probably using Gaussian interpolation, dithering, or some other method.
Somewhat related: my current screens support 4K, but when watching a 4K 60 fps video side by side on a 4K screen and a 1080p screen, no difference could be seen. It wouldn’t surprise me if the same were true for HDR, but I might be wrong.
Yes, from the capture (camera) all the way to distribution, the content has to preserve the HDR bit depth. Some content on YouTube is in HDR (noted in the quality settings along with 1080p, etc.), but the option only shows up if the content is HDR and the device playing it has HDR capabilities.
Regarding streaming, there is already a lot of HDR content out there, especially newer shows. But stupid DRM has always pushed us to alternative sources when it comes to playback quality on Linux anyway.
> no difference could be seen
If you’re not seeing a difference between 4K and 1080p, though, even up close, maybe your media isn’t really 4K. I find the difference quite noticeable.
> Yes, from the capture (camera) all the way to distribution, the content has to preserve the HDR bit depth.
Ah, that’s what I thought. Thanks.
> If you’re not seeing a difference between 4K and 1080p, though, even up close, maybe your media isn’t really 4K. I find the difference quite noticeable.
I tried with the best-known test video, Big Buck Bunny. Their website is down now and only the Internet Archive has it, but I did the test back when it was up. I also found a few 4K videos on YouTube and elsewhere. Maybe the people I tested with and I just aren’t sensitive to 4K video on 30-35 inch screens.
> aren’t sensitive to 4K video
So you’re saying you need glasses?
But yes, it does make a difference how much of your field of view is covered. If it’s a small screen and you’re relatively far away, 4K isn’t doing anything. And of course, you need a 4K-capable screen in the first place, which is still not a given for PC monitors, precisely due to their size. For a 21" desktop monitor, it’s simply not necessary. Although I’d argue that less than 4K on a 32" screen that’s about an arm’s length away from you (like on a desktop) is noticeably low-res.
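A rough back-of-the-envelope number for that claim, assuming a 32" 16:9 screen at about 60 cm (arm’s length) and taking the commonly cited ~60 pixels per degree visual acuity limit as a given:

```bash
#!/usr/bin/env bash
# Pixels per degree for 1080p vs 4K on a 32-inch 16:9 screen viewed from 60 cm.
awk 'BEGIN {
  diag_in = 32; dist_cm = 60
  width_cm = diag_in * 2.54 * 16 / sqrt(16^2 + 9^2)     # ~70.8 cm wide
  fov_deg  = 2 * atan2(width_cm / 2, dist_cm) * 180 / 3.14159265
  for (px = 1920; px <= 3840; px += 1920)
    printf "%4d px across %.0f deg  ->  %.0f pixels per degree\n", px, fov_deg, px / fov_deg
}'
# Prints roughly 31 ppd for 1080p and 63 ppd for 4K, i.e. 1080p sits well
# below that acuity limit at this size and distance, while 4K is right around it.
```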
> So you’re saying you need glasses?
No. Just like some people aren’t sensitive to 3D movies, we aren’t sensitive to 4k 🤷