But how does one even know? That's the problem with HDR. Until you've seen it done right, for a sustained period, in conditions comparable to your own home setup, it's unlikely you can make an accurate enough comparison to tell whether your hardware, your software, or some other part of the setup is at fault for it looking crap.
I mean, we have a TV that can do HDR. Does it really look much better than normal 4K? No. But is that the TV? The cable? The input device? I assume it's the TV, but I have no idea. Likewise, my monitor can do HDR, but it looks crap: vastly worse in Windows, totally washed out, and, depending on the game, somewhere between slightly and vastly worse.
I have a friend with a new TV, an Xbox Series X/S, a definitely brand-new cable, and HDR. Does that look good? Well, the blacks are slightly blacker, and that's about all I can tell. This isn't a vision issue; I have ridiculously good vision and colour sense (especially for my age; I still don't wear glasses at 42). I have seen displays in stores that were incredible, and were HDR, though they all seemed to say "QLED", and I have no idea whether a home setup would get that much out of them.
I don't agree if we're talking about games. I can easily see the benefit of a 144Hz refresh rate, because you can just turn off Vsync and get higher framerates with little or no tearing. It's also really obvious how much easier it is to track and spot moving objects on those monitors compared to lower-refresh-rate ones.
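To put rough numbers on the tracking point, here's a minimal sketch (the 1000 px/s object speed is just an illustrative assumption):

    # Frame time and per-frame motion step at common refresh rates,
    # for an object moving at an assumed 1000 px/s across the screen.
    SPEED_PX_PER_S = 1000

    for hz in (60, 144, 240):
        frame_ms = 1000.0 / hz         # how long each frame stays on screen
        step_px = SPEED_PX_PER_S / hz  # how far the object jumps per frame
        print(f"{hz:3d} Hz: {frame_ms:5.2f} ms/frame, ~{step_px:4.1f} px/frame")

At 60Hz that object jumps ~17px between frames; at 144Hz it's ~7px, which is a big part of why motion looks smoother and is easier to follow.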
Obviously they're pretty pointless for phones, and for TVs that aren't used for gaming, or where that refresh rate can't be driven at full resolution (or where the devices connected to the TV can't output a high enough framerate to make it worthwhile).
Maybe you're talking about 240Hz and up, though; I can't comment on that, as I've never seen it in action.