pedrocr | 8 months ago

That's probably better than most scaling done on Wayland today because it's doing the rendering directly at the target resolution instead of doing the "draw at 2x scale and then scale down" dance that was popularized by OSX and copied by Linux. If you do it that way you both lose performance and get blurry output. The only corner case a compositor needs to cover is when a client is straddling two outputs. And even in that case you can render at the larger size and get perfect output on one output and the same blurriness downside on the other, so it's still strictly better.

It's strange that Wayland didn't do it this way from the start given its philosophy of delegating most things to the clients. All you really need for arbitrary scaling is to tell apps "you're rendering to an MxN pixel buffer, and as a hint the scaling factor of the output you'll be composited to is X.Y". After that the client can handle events in real coordinates and scale in the best way possible for its particular context. For a browser, PDF viewer or image-processing app that can render at arbitrary resolutions, not being able to do that is very frustrating if you want good quality and performance. Hopefully we'll finally be getting that in Wayland now.
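
To make that concrete, here's a minimal C sketch of that contract (the struct and function names are made up for illustration, not any real Wayland API): the compositor hands the client a pixel size plus a scale hint, and the client renders straight into that buffer and maps input back to its own pixel grid itself.

    /* Hypothetical contract: target pixel size plus a scale hint. */
    #include <math.h>
    #include <stdint.h>

    struct surface_hint {
        uint32_t buffer_width;   /* M: pixels the client should render */
        uint32_t buffer_height;  /* N: pixels the client should render */
        double   scale;          /* X.Y: scale factor of the target output */
    };

    /* The client picks its layout size in logical units... */
    static void layout_size(const struct surface_hint *hint, double *w, double *h)
    {
        *w = hint->buffer_width / hint->scale;
        *h = hint->buffer_height / hint->scale;
    }

    /* ...and converts pointer events (delivered in logical coordinates) back
     * to the pixel grid it actually rendered, so hit testing stays exact. */
    static void event_to_pixels(const struct surface_hint *hint,
                                double lx, double ly, int *px, int *py)
    {
        *px = (int)lround(lx * hint->scale);
        *py = (int)lround(ly * hint->scale);
    }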

kccqzy|8 months ago

> doing the "draw at 2x scale and then scale down" dance that was popularized by OSX

Originally OS X defaulted to drawing at 2x scale without any scaling down because the hardware was designed to have the right number of pixels for 2x scale. The earliest retina MacBook Pro in 2012 for example was 2x in both width and height of the earlier non-retina MacBook Pro.

Eventually I guess the cost of the hardware made this too hard. For example, how many different SKUs are there for 27-inch 5K LCD panels versus 27-inch 4K ones?

But before Apple committed to integer scaling factors and then scaling down, it experimented with more traditional approaches. You can see this in earlier OS X releases such as Tiger or Leopard. The thing is, it probably took too much effort for even Apple itself to implement in its first-party apps, so Apple knew there would be low adoption among third-party apps. Take a look at this HiDPI rendering example in Leopard: https://cdn.arstechnica.net/wp-content/uploads/archive/revie... It was Apple's own TextEdit app and it was buggy. They did have a nice UI to change the scaling factor to be non-integral: https://superuser.com/a/13675

pedrocr|8 months ago

> Originally OS X defaulted to drawing at 2x scale without any scaling down because the hardware was designed to have the right number of pixels for 2x scale.

That's an interesting related discussion. The idea that there is a physically correct 2x scale and that fractional scaling is a tradeoff is not necessarily right. First because different users will want to place the same monitor at different distances from their eyes, or have different eyesight, or differ in a myriad other ways, so the ideal scaling factor for the same physical device depends on the user and the setup. But more importantly because having integer scaling be sharp and snapped to pixels while fractional scaling is a tradeoff is mostly a software limitation. GUI toolkits can still place all their UI at pixel boundaries even if you give them a target scaling of 1.785. They do need extra logic to do that and most can't. But in a weird twist of destiny the most used app these days is the browser, and its rendering engines are designed to output at arbitrary factors natively yet in most cases can't, because the windowing system forces these extra transforms on them. 3D engines are another example: they can output at whatever arbitrary resolution is needed but aren't allowed to. Most games can probably get around that in some kind of fullscreen mode that bypasses the scaling.
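
As a toy illustration of that extra logic (my own sketch, not taken from any particular toolkit), the trick is to snap each logical edge to the device pixel grid before drawing, rather than scaling a finished 1x image afterwards:

    #include <math.h>

    struct rect { double x, y, w, h; };        /* logical units */
    struct pixel_rect { int x, y, w, h; };     /* device pixels */

    /* Snap edges, not sizes: widths come from the snapped edges, so
     * adjacent elements still tile without gaps even at scale 1.785. */
    static struct pixel_rect snap_rect(struct rect r, double scale)
    {
        int x0 = (int)lround(r.x * scale);
        int y0 = (int)lround(r.y * scale);
        int x1 = (int)lround((r.x + r.w) * scale);
        int y1 = (int)lround((r.y + r.h) * scale);
        return (struct pixel_rect){ x0, y0, x1 - x0, y1 - y0 };
    }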

I think we've mostly ignored these issues because computers are so fast and monitors have gotten so high resolution that the significant performance penalty (easily 2x) and the introduced blurriness mostly go unnoticed.

> Take a look at this HiDPI rendering example in Leopard

That's a really cool example, thanks. At one point Ubuntu's Unity had a fake fractional scaling slider that just used integer scaling plus font size changes for the intermediate levels. That mostly works very well from the user's point of view. Because of the current limitations in Wayland I still mostly do that manually. It works great for a single monitor and can work for multiple monitors if the scaling factors work out, because the font scaling is universal and not per output.

trinix912|8 months ago

Out of curiosity, do you happen to know why Apple thought that would be the cause for low adoption among 3rd party apps? Isn't scaling something that the OS should handle, that should be completely transparent, something that 3rd party devs can forget exists at all? Was it just that their particular implementation required apps to handle things manually?

cosmic_cheese|8 months ago

Even today you run into the occasional foreign UI toolkit app that only renders at 1x and gets scaled up. We’re probably still years out from all desktop apps handling scaling correctly.

frizlab|8 months ago

Completely unrelated but man was Aqua beautiful

ndiddy|8 months ago

Wayland has supported X11 style fractional scaling since 2022: https://wayland.app/protocols/fractional-scale-v1 . Both Qt and GTK support fractional scaling on Wayland.
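
For the curious, here's a rough sketch of what using that protocol can look like from a client, combined with wp_viewporter so the compositor doesn't have to rescale the result. It assumes headers generated by wayland-scanner and an already-bound surface and viewport; error handling and buffer creation are omitted.

    #include <math.h>
    #include <stdint.h>
    #include <wayland-client.h>
    #include "fractional-scale-v1-client-protocol.h"
    #include "viewporter-client-protocol.h"

    struct app {
        struct wp_viewport *viewport;
        int logical_w, logical_h;   /* window size in logical units */
        double scale;               /* preferred scale, e.g. 1.25 */
    };

    /* The compositor sends the preferred scale as a fraction with
     * denominator 120, so 150 means 1.25. */
    static void handle_preferred_scale(void *data,
                                       struct wp_fractional_scale_v1 *obj,
                                       uint32_t scale_120)
    {
        struct app *app = data;
        (void)obj;
        app->scale = scale_120 / 120.0;

        /* Render the buffer at the true pixel size (creation not shown)... */
        int buf_w = (int)lround(app->logical_w * app->scale);
        int buf_h = (int)lround(app->logical_h * app->scale);
        (void)buf_w; (void)buf_h;

        /* ...and declare the logical size it maps to, so the compositor
         * can composite it 1:1 with no second scaling pass. */
        wp_viewport_set_destination(app->viewport, app->logical_w, app->logical_h);
    }

    static const struct wp_fractional_scale_v1_listener scale_listener = {
        .preferred_scale = handle_preferred_scale,
    };

The wp_fractional_scale_v1 object itself comes from wp_fractional_scale_manager_v1_get_fractional_scale() on the wl_surface, with scale_listener attached via wp_fractional_scale_v1_add_listener().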

bscphil|8 months ago

Rather annoyingly, the compositor support table on this page seems to be showing only the latest version of each compositor (plus or minus a month or two, e.g. it's behind on KWin). I assume support for the protocol predates these versions for the most part? Do you know when the first versions of KDE and Gnome to support the protocol were released? Asking because some folks in this thread have claimed that a large majority of shipped Wayland systems don't support it, and it would be interesting to know if that's not the case (e.g. if Debian stable had support in Qt and GTK applications).

hedora|8 months ago

Fractional scaling is the problem, not the solution! It replaces rendering directly at the monitor’s DPI, which is strictly better, and used to be well-supported under Linux.

resonious|8 months ago

As someone who just uses Linux but doesn't write compositor code or really know how they work: Wayland supports fractional scaling way better than X11. At least I was unable to get X11 to do 1.5x scale at all. The advice was always "just increase font size in every app you use".

Then when you're on Wayland using fractional scaling, XWayland apps look very blurry all the time while Wayland-native apps look great.

waldiri|8 months ago

As a similar kind of user, I set `Xft.dpi: 130` in .Xresources.

If I want to use multiple monitors with different DPIs, I update it on every switch by echoing the above to `xrdb -merge -`, so newly launched apps inherit the DPI of the monitor they were started on.

Dirty solution, but the results are pretty nice and without any blurriness.

1718627440|8 months ago

xrandr --output HDMI1 --scale 1.5x1.5

sho_hn|8 months ago

> doing the "draw at 2x scale and then scale down" dance that was popularized by OSX and copied by Linux

Linux does not do that.

> It's strange that Wayland didn't do it this way from the start

It did (initially for integer scale factors, later also for fractional ones, though some Wayland-based environments did it earlier downstream).

maxdamantus|8 months ago

> Linux does not do that.

It did (or at least Wayland compositors did).

> It did

It didn't.

I complained about this a few years ago on HN [0], and produced some screenshots [1] demonstrating the scaling artifacts resulting from fractional scaling (1.25).

This was before fractional scaling existed in the Wayland protocol, so I assume that if I try it again today with updated software I won't observe the issue (though I haven't tried yet).

In some of my posts from [0] I explain why it might not matter that much to most people, but essentially, modern font rendering already blurs text [2], so further blurring isn't that noticeable.

[0] https://news.ycombinator.com/item?id=32021261

[1] https://news.ycombinator.com/item?id=32024677

[2] https://news.ycombinator.com/item?id=43418227

jdsully|8 months ago

Windows tried this for a long time and literally no app was able to make it work properly. I spent years of my life making Excel have a sane rendering model that worked on device-independent pixels and all that, but it's just really hard for people not to think in raw pixels.

kllrnohj|8 months ago

And yet every Android app does it just fine :)

The real answer is just that it's hard to bolt this on later; the UI toolkit needs to support it from the start.

pwnna|8 months ago

So I don't understand where the meme of blurry super-resolution-based downsampling comes from. If that were the case, what would super-resolution antialiasing[1] be? An image rendered at a higher resolution and then downsampled is usually sharper than an image rendered directly at the lower resolution, because it preserves the high-frequency components of the signal better. There are multiple other downsampling-based anti-aliasing techniques, all of which boost the signal-to-noise ratio. Does this not work for UI as well? Most of it is vector graphics. Bitmap icons will need to be updated, but the rest of the UI (text) should be sharp.

I know people mention 1-pixel lines (perfectly horizontal or vertical). They multiply by 1.25 or whatever and go: oh look, 0.25 pixel is a lie, therefore fractional scaling is fake (sway's documentation mentions this to this day). This doesn't seem to hold in practice outside of this very niche mental exercise. At sufficiently high resolution, which is the case for the displays we are talking about, do you even want 1-pixel lines? They will be barely visible; I have this problem now on Linux. Further, if the line is draggable, the click zone becomes too small as well. You probably want something of some physical dimension, which will probably take multiple pixels anyway, and at that point you probably want some antialiasing that you won't be able to see anyway. Also, single-pixel lines don't have to be exactly the color the program prescribed. Most of the perfectly horizontal and vertical lines on my screen are grey-ish; some AA artifacts will change their color slightly, but I don't think that has a material impact. If this is the case, then super resolution should work pretty well.

Then really what you want is something as follows:

1. Super-resolution scaling for most "desktop" applications.

2. Give the native resolution to full-screen applications (games, video playback), and possibly give the native resolution of an on-screen rectangle to applications like video players. This avoids rendering at a higher resolution and then downsampling, which can introduce information loss for these applications.

3. Now do this on a per-application basis instead of a per-session basis. No Linux DE implements this; KDE implements it per session, which is not flexible enough. You have to do it for each application on launch.

[1]: https://en.wikipedia.org/wiki/Supersampling
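
For what it's worth, the downsampling step in supersampling is just averaging blocks of the high-resolution render. A minimal single-channel 2x box-filter sketch (real resolvers work per RGBA channel and may use better filters than a box):

    #include <stddef.h>
    #include <stdint.h>

    /* Resolve a 2x supersampled render: average each 2x2 block of the
     * source into one destination pixel (8-bit, single channel, row-major). */
    static void downsample_2x(const uint8_t *src, size_t src_w, size_t src_h,
                              uint8_t *dst /* (src_w/2) x (src_h/2) */)
    {
        size_t dst_w = src_w / 2, dst_h = src_h / 2;
        for (size_t y = 0; y < dst_h; y++) {
            for (size_t x = 0; x < dst_w; x++) {
                unsigned sum = src[(2*y)     * src_w + 2*x]
                             + src[(2*y)     * src_w + 2*x + 1]
                             + src[(2*y + 1) * src_w + 2*x]
                             + src[(2*y + 1) * src_w + 2*x + 1];
                dst[y * dst_w + x] = (uint8_t)((sum + 2) / 4); /* rounded mean */
            }
        }
    }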

c-hendricks|8 months ago

> So I don't understand where the meme of the blurry super-resolution based down sampling comes from. If that is the case, what is super-resolution antialiasing

It removes jaggies by using lots of little blurs (averaging)

wmf|8 months ago

None of the toolkits (Motif, Tk, Gtk, Qt, etc.) could handle fractional scaling, so if Wayland had taken the easy way out it would have broken every app.

nixosbestos|8 months ago

Except for the fact that Wayland has had a fractional scaling protocol for some time now. Qt implements it. There's some unknown reason that GTK won't pick it up. But anyway, it's definitely there. There's even a beta-level implementation in Firefox, etc.

lostmsu|8 months ago

Why is Wayland trying to monkey patch something that's broken elsewhere?

hedora|8 months ago

I’ll just add that it is much better than fractional scaling.

I switched to high-DPI displays under Linux back in the late 1990s. It worked great, even with old toolkits like Xaw and Motif, and certainly with GTK/GNOME/KDE.

This makes perfect sense, since old unix workstations tended to have giant (for the time) frame buffers, and CRTs that were custom-built to match the video card capabilities.

Fractional scaling is strictly worse than the way X11 used to work. It was a dirty hack when Apple shipped it (they had to, because their third-party software ecosystem didn't understand DPI), but cloning the approach is just dumb.

zozbot234|8 months ago

Isn't OS X graphics supposed to be based on Display PostScript/PDF technology throughout? Why does it have to render at 2x and downsample, instead of simply rendering vector-based primitives at native resolution?

kalleboo|8 months ago

OS X could do it; they actually used to support enabling fractional rendering like this through a developer tool (Quartz Debug).

There were multiple problems making it actually look good though, ranging from making things line up properly at fractional sizes (e.g. a "1 point line" becomes blurry at 1.25 scale) to the fact that most applications used bitmap images rather than vector graphics for their icons (and this includes the graphic primitives Apple used for the "lickable" buttons throughout the OS).

edit: I actually have an iMac G4 here so I took some screenshots, since I couldn't find any online. Here is Mac OS X 10.4 natively rendering windows at fractional sizes: https://kalleboo.com/linked/os_x_fractional_scaling/

IIRC later versions of OS X than this actually had vector graphics for buttons/window controls

astrange|8 months ago

No, CoreGraphics just happened to have drawing primitives similar to PDF.

Nobody wants to deal with vectors for everything. They're not performant enough (harder to GPU-accelerate) and you couldn't do the skeuomorphic UIs of the time with them. They have gotten more popular since, thanks to flat UIs and other platforms with free scaling.

qarl|8 months ago

You're thinking of NeXTSTEP. Before OS X.

wmf|8 months ago

No, I think integer coordinates are pervasive in Carbon and maybe even Cocoa. To do fractional scaling "properly" you need to use floating point coordinates everywhere.

crest|8 months ago

If you did it right you would render the damaged area of each window separately for each display it's visible on, but that would require more rigorous engineering than our software stacks have.

account42|8 months ago

It would also mean that moving the window either needs to wait for a repaint or becomes a hell of a lot more complicated, and it would still have really weird artifacts.

lotharcable|8 months ago

It is using OpenGL to draw instead of using X11.

Which pretty much means that it is using the same code paths and drivers that get used in Wayland.

6510|8 months ago

If the initial picture is large enough, the blur from down-scaling isn't so bad. Say 1.3 pixels per pixel vs 10.1 pixels per pixel.

account42|8 months ago

That also means you need a GPU that's 10x as powerful, though.