(no title)
SomeoneOnTheWeb | 7 months ago
This is one of the stupid things with many monitors: showing HDR at 250 nits is worse than showing no HDR at all. So no matter what you do, 99% of HDR content will look bad on your screen.
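A minimal sketch of the scale of the problem, assuming standard HDR10-style PQ (SMPTE ST 2084) encoding; the PQ constants are the standard ones, but the nit levels compared are illustrative, not measurements of any particular monitor:

```python
# How much of the PQ signal range a given panel peak can actually reproduce.
# Constants are the standard SMPTE ST 2084 curve parameters.

m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map absolute luminance (0..10000 nits) to a PQ signal value in 0..1."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for peak in (250, 600, 1000, 4000):
    print(f"{peak:>5} nits -> PQ code value {pq_encode(peak):.3f}")

# Roughly: 250 nits sits near code value 0.60, while 1000 nits sits near 0.75,
# so a 250-nit panel covers only part of the range that content mastered for
# 1000 nits expects; everything above its peak must be clipped or compressed.
```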
Tade0 | 7 months ago
Interestingly, my laptop's display reaches 500 nits, and that is already painfully bright outside of midday hours. My phone goes to 875, and I find that useful only outside in the summer sun.
simoncion | 7 months ago
I disagree. The wide color gamut is, for me, a huge part of HDR. My VA monitor provides ~300 nits of brightness, and I've been quite happy with the games that didn't phone in their HDR implementation.
Plus, any non-trash HDR monitor will tell the computer it's attached to what its maximum possible brightness is, so the software running on that computer can adjust its renderer accordingly.
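A minimal sketch of what that adjustment can look like, assuming the display's peak luminance (here 300 nits) has already been read from its EDID/DisplayID HDR metadata; the function name and the Reinhard-style rolloff are illustrative choices, not any particular engine's API:

```python
# Compress scene luminance into the display's advertised range instead of
# blindly targeting 1000+ nits. Below the knee the mapping is an identity;
# above it, highlights roll off smoothly toward the panel's peak.

def tone_map(scene_nits: float, display_peak_nits: float = 300.0,
             shoulder_start: float = 0.75) -> float:
    knee = shoulder_start * display_peak_nits
    if scene_nits <= knee:
        return scene_nits
    headroom = display_peak_nits - knee
    excess = scene_nits - knee
    # Reinhard-style rolloff: excess/(excess + headroom) rises from 0 toward 1,
    # so the output approaches the peak asymptotically instead of clipping.
    return knee + headroom * excess / (excess + headroom)

for nits in (100, 250, 400, 1000, 2000):
    print(f"scene {nits:>5} nits -> display {tone_map(nits):6.1f} nits")
```

With a hard clip instead of the rolloff, a 400-nit sign and a 2000-nit sun would both land on the same 300-nit output; the shoulder keeps some separation between them.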
dartharva | 7 months ago
My monitor does do that, but alas the software itself (Windows 10) wasn't good enough to adjust things correctly. It did make the decision to switch to ArchLinux easier by being one less thing I'd be missing.
geraldwhen | 7 months ago
Games are just truly awful about making scenes completely unviewable, even when the HDR areas, the blacks and whites, contain interactive elements you need to see and know about.
zapzupnz | 7 months ago
Not that many games on the console take advantage of it, mind you. More testing needed.