• MudMan@fedia.io · 11 hours ago

    Nah, man, they finally fixed it at some point on Windows 11. PCs struggled with it for the longest time, but these days, of the four dedicated PC monitors being used by different people in my house right now, three are HDR-capable and working just fine on Windows out of the box, as are multiple portable devices (including, incidentally, the Steam Deck OLED). Plus all the TVs in the house, obviously.

    HDR was standardized for TVs and started getting content almost a decade ago; it’s been a gaming and video default on consoles for two hardware generations and is increasingly a default feature on even cheap PC monitors. I agree that Windows took waaaay too long to get there, which was incredibly frustrating considering MS were supporting it just fine on Xbox, but it works well now and I miss it immediately when shifting to Linux on the same setup.

    VRR, too, but the situation there is a bit different.

    • stevedice@sh.itjust.works · 11 hours ago (edited)

      I run W11 daily and it isn’t fixed. Sure, HDR content works, but my screen needs to flicker for a bit before it gets enabled, and sometimes it doesn’t at all. Don’t even get me started on games that require you to have it on in the system before you can turn it on in the game. Sure, I could just leave it on all the time, but then SDR content looks washed out. I’m not saying it doesn’t work, just that it’s kinda annoying. As you mentioned, I can just turn on my TV, play an HDR video and it works, then switch to SDR content and it also works. When am I getting that experience on PC?
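      For what it’s worth, the system-wide switch those games gate on is the “advanced color” state in the Win32 display config API, the same state the Settings toggle (and the Game Bar’s Win+Alt+B shortcut) flips, and presumably what the third-party HDR toggle tools poke. Here’s a rough C sketch of that approach, using the documented user32/wingdi calls and skipping error handling; the tool name and build line are just placeholders, not anything shipped:

      ```c
      #include <windows.h>
      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>

      /* hdrtoggle.c (hypothetical): flip the "advanced color" (HDR) state on every
       * active display target. Build with: cl hdrtoggle.c user32.lib
       * Usage: hdrtoggle on|off  (defaults to on) */
      int main(int argc, char **argv)
      {
          BOOL enable = !(argc > 1 && strcmp(argv[1], "off") == 0);

          /* Ask how many active display paths there are, then fetch them. */
          UINT32 pathCount = 0, modeCount = 0;
          if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &pathCount, &modeCount) != ERROR_SUCCESS)
              return 1;

          DISPLAYCONFIG_PATH_INFO *paths = calloc(pathCount, sizeof *paths);
          DISPLAYCONFIG_MODE_INFO *modes = calloc(modeCount, sizeof *modes);
          if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &pathCount, paths, &modeCount, modes, NULL) != ERROR_SUCCESS)
              return 1;

          for (UINT32 i = 0; i < pathCount; i++) {
              /* Request the new advanced color (HDR) state on this display target. */
              DISPLAYCONFIG_SET_ADVANCED_COLOR_STATE state = {0};
              state.header.type      = DISPLAYCONFIG_DEVICE_INFO_SET_ADVANCED_COLOR_STATE;
              state.header.size      = sizeof state;
              state.header.adapterId = paths[i].targetInfo.adapterId;
              state.header.id        = paths[i].targetInfo.id;
              state.enableAdvancedColor = enable ? 1 : 0;
              /* Returns an error on targets that can't do HDR; ignored here. */
              DisplayConfigSetDeviceInfo(&state.header);
          }

          printf("Requested advanced color %s on %u display path(s)\n", enable ? "on" : "off", pathCount);
          free(paths);
          free(modes);
          return 0;
      }
      ```

      None of that helps with the flicker, though: that part is the monitor re-syncing when the signal actually switches over, so any toggle, Settings included, does it.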

      • MudMan@fedia.io · 11 hours ago

        Hm. SDR content in HDR containers has been working well for me on both DP 1.4- and HDMI 2.1-enabled GPUs, with no washed-out content, which I did use to get. It did not work over HDMI on an Intel A770 I tested, where their weird DP-to-HDMI adaptor didn’t support HDR at all (I hear it’s working on the Battlemage cards, thankfully), but it does work over DP, and it also works well out of the box on my monitors using the integrated HDMI out on a 7-series Ryzen APU, so I’m guessing it’s doing fine on newer AMD cards, too.

        I do believe it’s borked for you, and if you’re on a last-gen card with HDMI 2.0 I have seen it do the old washed-out-SDR garbage even under Win11, so I absolutely concede that it’s still crappier than the way more reliable solutions for TV-focused hardware. Still, it works way more reliably than it used to on Windows, and it’s way more finicky and harder to set up on Linux than it is on Windows these days (or outright unsupported, depending on your flavor of DE).

        • stevedice@sh.itjust.works · 10 hours ago (edited)

          I actually upgraded to Windows 11 specifically because I was told they fixed HDR. I do have an RX7600, so it’s technically “last gen”, but I’m running DP (I have no idea which version, but it has to be at least 1.4 because it runs 1080p at 180Hz). Washed-out SDR content isn’t that bad; I actually didn’t even notice until I dragged a window playing SDR content to my second monitor, which doesn’t have HDR, and the blacks looked, well, blacker. I don’t doubt that it’s worse on Linux; I wasn’t trying to defend it. Just wanted to point out that it seems like no OS that isn’t designed to run only on TVs gives a crap about the HDR experience.

          • MudMan@fedia.io · 10 hours ago

            Man, I hated it. The only reason I give Windows (and GPU manufacturers, I suppose) credit for improving it this gen is that I was trying to output PC games to an HDR TV fairly early on, and I ended up having to choose between the crappy SDR output and playing worse-looking games on console with HDR enabled, and it was a dealbreaker. It is a godsend to be able to leave HDR on all the time on my monitors and just… not have to think about it.

            SDR for me now either looks fine as-is or is picked up by AutoHDR and remapped. It now works as I would expect, and at high framerates and resolutions, too, as it seemed to automatically pick out the right type of DSC to fit my setup.

            I’ll be honest, when I got a high-refresh-rate monitor I was completely sure I wasn’t going to be able to get it all working at once, based on previous experience, but it just did. It sucks to learn that experience isn’t universal, especially since the RX7600 should have all the hardware it needs to do this. That integrated AMD GPU I mentioned did it all just fine out of the box for me as well, and it’s from that same generation, so the 7600 should work the same way.

            The temptation is to try to troubleshoot it with you and suggest it’s a setup issue, but my entire point here is that it should work out of the box every time, or at least tell you what to push to change it if it’s supported; I don’t care what OS we’re talking about.