This is where I disagree. The whole point of HDR is that the director can determine the actual light output (not just the white point) of each pixel in a scene, so the TV can mimic the real world with real-world levels of light. What you're describing is contrast when you reference deep blacks, plus perceived brightness relative to viewing conditions. But even then, when an LCD is putting out four times the light of an OLED, even perfect blacks can't close that gap. When a couple thousand nits are shining in your face from a car's headlights in a scene, no amount of perceived brightness in a dark room can give you that same feeling at a couple hundred nits.
I've always been all about TV size and calibrating to nail the colors accurately, then calling it a day. I'm not even that impressed by the leap from 1080p to 4K. Peak brightness changed that for me, to the point that I now prefer watching on a smaller screen with higher light output. You mentioned theaters looking blindingly bright; yeah, that's what I thought too, until I first saw 2000 nits in my face lol. The gap between ~50 nits in a theater and 2000 nits is massive. No amount of contrast or perceived brightness can compete.
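To put some numbers on why 200 vs 2000 nits feels like such a big step: HDR10 encodes pixel values with the SMPTE ST 2084 "PQ" curve, which is designed to be roughly perceptually uniform, so distance along the PQ signal is a decent proxy for how different two luminance levels look. Here's a quick sketch of the PQ inverse EOTF (constants are straight from the ST 2084 spec; the print loop is just my own illustration):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: maps absolute luminance in nits
# onto a roughly perceptually uniform 0-1 signal. Constants per spec.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    """Luminance in cd/m^2 (nits) -> PQ signal value in [0, 1]."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

for nits in (50, 200, 1000, 2000, 10000):
    print(f"{nits:>5} nits -> PQ signal {pq_encode(nits):.3f}")
```

By this measure, going from ~200 nits to ~2000 nits covers roughly a quarter of the entire PQ signal range, which lines up with how dramatic the jump looks in person, while "perfect blacks" only buy you the very bottom sliver of that range.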
Edit - here’s a review of Sony’s 10000 nits monster:
https://www.cnet.com/news/tvs-are-on...ght-is-enough/