  • I had thought it was about the color profile, because with HDR disabled in System Settings, enabling the built-in color profile desaturates colors quite a bit and applies some kind of perceived-brightness-to-luminosity mapping that desaturates bright / dark HDR content even more. I don’t think that’s the cause of my problems anymore, though.

    Thanks to your tip about kscreen-doctor, I could try the different combinations of HDR / WCG / EDID and see how the colors look with each:
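
    In case anyone wants to reproduce the sweep: a minimal sketch of what I ran, assuming Plasma 6’s kscreen-doctor switches for HDR/WCG and a hypothetical output named DP-1 (check yours with kscreen-doctor -o; the EDID-vs-built-in profile part I toggled in System Settings):

    ```python
    import itertools
    import subprocess

    OUTPUT = "DP-1"  # assumption: replace with your output name from `kscreen-doctor -o`

    # Walk every HDR/WCG combination and pause so the colors can be
    # eyeballed before moving on to the next one.
    for hdr, wcg in itertools.product(("enable", "disable"), repeat=2):
        subprocess.run(
            ["kscreen-doctor",
             f"output.{OUTPUT}.hdr.{hdr}",
             f"output.{OUTPUT}.wcg.{wcg}"],
            check=True,
        )
        input(f"hdr={hdr}, wcg={wcg} -- press Enter for the next combination")
    ```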

    I think there must be something wrong with my screen, since enabling HDR reduces saturation more than anything else. Anyway, thanks for the good work!

    Edit: Tried this with an AMD GPU. HDR + WCG works as expected, without muted colors. HDR without WCG still significantly desaturates colors, so I guess that’s a monitor bug. Now to figure out GPU passthrough… (Edit 2: It seems to just work??)

    Side note: when I turn off only HDR from kscreen-doctor, the display stays in HDR mode until it turns off and on again. That didn’t happen with NVIDIA.

    Edit 3: Found something weirder… HDR colors are muted on the NVIDIA GPU and vibrant with the AMD iGPU. If I plug the monitor into the motherboard (AMD), enable HDR, then unplug it and plug it into the NVIDIA GPU, the colors are still vibrant??? I can disable and re-enable HDR again and again and they aren’t affected. They’re even fine when HDR is enabled without WCG??? But if I fully turn the monitor off and back on, they once again become muted with HDR. Weird-ass behavior.

  • No? AFAIK VSync prevents the GPU from sending half-drawn frames to the monitor, not the monitor from displaying them. The tearing happens in the GPU buffer. (Edit: read the edit below.)

    Though I’m not sure how valid the part about latency is. In the worst-case scenario (transferring a frame takes the whole previous frame time), the latency of an LCD can only be double that of a CRT at the same refresh rate, which 120+ Hz already compensates for. And as for the inherent latency of the screen itself, most gaming LCD monitors have less than 5 ms of input lag, while a CRT on average takes half the frame time to display a pixel, so 8 ms at 60 Hz.

    Edit: thought this over again. On a CRT those two happen simultaneously, so the total latency is 8 ms + pixel response time (which I don’t know the value of). On LCDs, the transfer time should be (video stream bandwidth / cable bandwidth) × frame time, and that runs back to back with the time to display the frame, which is frame time / 2 + pixel response time. That sum could exceed the CRT’s latency.
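
    Putting rough numbers on that reasoning (every input here is an assumption: 120 Hz, a ~8 Gbit/s stream on an ~18 Gbit/s link, 1 ms phosphor decay, 5 ms LCD pixel response):

    ```python
    refresh_hz = 120
    frame_ms = 1000 / refresh_hz                 # ~8.33 ms per frame

    # CRT: transfer and display happen simultaneously, so the average
    # pixel shows up half a frame in, plus phosphor response.
    crt_ms = frame_ms / 2 + 1                    # 1 ms phosphor decay (assumed)

    # LCD under the "wait for the whole frame" premise: transfer first,
    # then scanout (avg. half a frame) plus pixel response, back to back.
    transfer_ms = (8 / 18) * frame_ms            # stream / cable bandwidth (assumed)
    lcd_ms = transfer_ms + frame_ms / 2 + 5      # 5 ms pixel response (assumed)

    print(f"CRT ~{crt_ms:.1f} ms, LCD ~{lcd_ms:.1f} ms")  # ~5.2 vs ~12.9
    ```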

    BUT I took the input lag number from my monitor’s rtings page, and looking into how they measure it, it seems to include both the transfer time and frame time / 2, and it’s somehow still below 5 ms? That’s weird to me, since for that the transfer would either have to happen within < 1 ms (impossible), or the entire premise was wrong and LCDs do start drawing before the entire frame reaches them.
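
    The second option is actually consistent with how the link works, as far as I know: the cable paces pixels out in raster order across the whole refresh interval, so each line arrives just in time to be driven, and “transfer” overlaps “display” almost entirely (the line count below is an assumed 1080p vertical total):

    ```python
    refresh_hz = 120
    total_lines = 1125                 # assumed vertical total for a 1080p timing
    frame_us = 1_000_000 / refresh_hz
    line_us = frame_us / total_lines

    # One scanline arrives roughly every 7.4 us, so the panel never has
    # to sit on a full frame before it can start updating pixels.
    print(f"one line every {line_us:.1f} us")
    ```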

    Although I’m pretty sure that’s still not the cause of tearing, which happens because a frame is progressively rendered and written to the buffer, not because it’s progressively transferred or displayed.
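
    A toy model of that last point, in case it helps (not how any real compositor is implemented): the tear shows up because the buffer being scanned out changes mid-scan, and VSync simply makes the change wait for the vertical blank:

    ```python
    LINES = 8

    def scanout(front, swap_at=None, new=None):
        """Read the front buffer line by line, like a display scanning out."""
        shown = []
        for line in range(LINES):
            if swap_at is not None and line == swap_at:
                front = new            # unsynchronized buffer update mid-scanout
            shown.append(front[line])
        return shown

    old_frame, new_frame = ["old"] * LINES, ["new"] * LINES
    print(scanout(old_frame, swap_at=3, new=new_frame))  # torn: old on top, new below
    print(scanout(new_frame))                            # vsynced: swap waited for vblank
    ```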