top | item 30769211

nimbix | 4 years ago

IMHO 4k was the worst thing to ever happen to monitors. Now we have screens ranging from 24" to 43"+ all using the same physical resolution. This means that at all the most popular screen sizes you need to use some weird 1.5 or 1.25 scaling factor, which results in all sorts of rendering quirks, like 1px gaps when rendering engines try to fit elements to pixel boundaries. Plus SVGs often rasterize to a blurry mess, since there is no rendering magic that can make a 1px line rasterize nicely onto one and a half device pixels.
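A toy model of where those 1px gaps come from (a sketch, not any real engine's algorithm; it uses Python's `round`, which rounds halves to even, while real renderers have their own rounding rules): snapping element *edges* to device pixels and rounding element *widths* independently give answers that disagree by one device pixel at a 1.25× scale.

```python
scale = 1.25
edges_css = [0, 10, 20, 30]  # three adjacent 10px-wide elements in CSS pixels

# Snap each element edge to the nearest device pixel:
edges_dev = [round(e * scale) for e in edges_css]                      # [0, 12, 25, 38]
widths_from_edges = [b - a for a, b in zip(edges_dev, edges_dev[1:])]  # [12, 13, 13]

# But if each element's width is instead rounded independently:
width_independent = round(10 * scale)                                  # 12

# ...the second element gets painted over device pixels [12, 24),
# while the third starts at device pixel 25. Pixel 24 is never
# painted, which shows up on screen as a 1px gap.
gap = edges_dev[2] - (edges_dev[1] + width_independent)                # 1
```

At an integer scale factor both computations agree for every element, which is exactly why 2× scaling sidesteps the problem.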

legalcorrection | 4 years ago

Windows does fractional display scaling just fine. Most software these days supports it and renders perfectly. Yes, occasionally you run into some legacy pixel-based UI and get a blurry window. This will almost always be some utility software, not a program you interact with for long periods of time.

kmeisthax | 4 years ago

1080p was the same way. 1440p was almost exclusively an Apple thing. Monitor manufacturers want to sell size, not resolution.

I'm starting to wonder if vector formats and layout engines need pixel hinting capabilities, like font formats already do. As far as I'm aware browser engines already have to hack in these sorts of layout tweaks to avoid, say, float-based layouts collapsing into multiple lines at odd zoom percentages. Apple's insistence on integer scaling ratios is noble, but it also renders the display part of their hardware ecosystem an island.
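The hinting idea can be motivated with a small coverage calculation (a sketch assuming a box-filter anti-aliasing model; `coverage` is a hypothetical helper, not a browser API). A 1px line lands on whole device pixels at 2× but straddles a pixel boundary at 1.5×, producing a half-opaque row, which is the blur the parent comment describes; hinting would snap the line back to whole-pixel boundaries.

```python
import math

def coverage(css_top, css_bottom, scale):
    """Fraction of each device-pixel row covered by a CSS-pixel span
    (box-filter model of anti-aliased rasterization)."""
    top, bottom = css_top * scale, css_bottom * scale
    return {row: min(bottom, row + 1) - max(top, row)
            for row in range(math.floor(top), math.ceil(bottom))}

# A 1px-thick horizontal line from y=5 to y=6 in CSS pixels:
coverage(5, 6, 2.0)  # {10: 1.0, 11: 1.0} -- two fully opaque rows, crisp
coverage(5, 6, 1.5)  # {7: 0.5, 8: 1.0}   -- a half-opaque row: the blur
```

Font hinting solves exactly this by nudging stems onto pixel boundaries at render time; SVG and CSS layout currently have no equivalent mechanism.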