
Highdpi is awful. #796

Closed
icefoxen opened this issue Feb 17, 2019 · 28 comments

@icefoxen
Contributor

Making this an actual issue here so I can get around to doing something about it someday. It's been a pain in my ass since forever; people using ggez constantly ask why they get a 1200x900 pixel window when they ask for an 800x600 one, and how in the name of Eris they can turn it off.

Basically, if you are trying to make something pretty, you NEED to be able to actually get accurate information on where the hell pixels are, without things trying to hide it from you. The easiest way to do this is to actually just turn off any highdpi nonsense, rather than forcing the user to try to figure out everywhere they need to put in a conversion factor. Does hidpi scaling apply to the location of mouse events, for example? I don't even know.

To demonstrate why this can be a problem, observe this program: https://github.com/icefoxen/heckin_dpi. You can turn on MSAA or such but you still get a jerky effect where the lines walk across the screen slightly out of sync with each other, even though the actual mathematical distance between the lines is always the same.

@mitchmindtree
Contributor

@icefoxen can you please let us know the exact issues you are getting? Is the incorrect window size the only issue? Which platforms are you experiencing issues with? What specifically happens on each one that you don't like?

Also you've probably already read this, but the winit::dpi module-level docs are an essential read for understanding the way that DPI in winit works and why it works the way it does.

There are also some known issues on X11 which I think are fixed by #606 but that PR is waiting on a rebase I believe.

For the record, I'm very happy with my "hidpi" experience in winit since francesca's recentish overhaul (other than the issues addressed by #606), but I haven't used Windows since then.

@icefoxen
Contributor Author

Yeah, give me a few days to compile a list of bugs.

@fschutt

fschutt commented Feb 25, 2019

Also fschutt/azul#61 - I "solved" it by shelling out to gsettings on X11:

/// Return the DPI factor on X11 systems
#[cfg(target_os = "linux")]
fn linux_get_hidpi_factor(monitor: &MonitorId, events_loop: &EventsLoop) -> f64 {

    use std::env;
    use std::process::Command;
    use glium::glutin::os::unix::EventsLoopExt;

    let winit_dpi = monitor.get_hidpi_factor();
    let winit_hidpi_factor = env::var("WINIT_HIDPI_FACTOR").ok().and_then(|hidpi_factor| hidpi_factor.parse::<f64>().ok());
    let qt_font_dpi = env::var("QT_FONT_DPI").ok().and_then(|font_dpi| font_dpi.parse::<f64>().ok());

    // Execute "gsettings get org.gnome.desktop.interface text-scaling-factor" and parse the output
    let gsettings_dpi_factor =
        Command::new("gsettings")
            .arg("get")
            .arg("org.gnome.desktop.interface")
            .arg("text-scaling-factor")
            .output().ok()
            .map(|output| output.stdout)
            .and_then(|stdout_bytes| String::from_utf8(stdout_bytes).ok())
            .map(|stdout_string| stdout_string.lines().collect::<String>())
            .and_then(|gsettings_output| gsettings_output.parse::<f64>().ok());

    // First Some(_) wins: WINIT_HIDPI_FACTOR, then QT_FONT_DPI, then gsettings,
    // finally falling back to winit's own calculated factor.
    let options = [winit_hidpi_factor, qt_font_dpi, gsettings_dpi_factor];
    options.iter().filter_map(|x| *x).next().unwrap_or(winit_dpi)
}

For example, my display has a (calculated) DPI factor of 1.25, but a gsettings factor of 1.0 - it would be nice if winit had a .set_hidpi_factor() function, so that users can override winit's DPI factor with a custom one (without using environment variables). Right now I have to make custom back-and-forth calculations around the window size / DPI.
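
For illustration, a minimal sketch of the kind of application-side override being asked for here; the type and method names are hypothetical, and winit itself provides no set_hidpi_factor:

/// Hypothetical application-side DPI override, sketching the behaviour a
/// `set_hidpi_factor`-style API would enable. Not a winit API.
struct DpiOverride {
    forced_factor: Option<f64>,
}

impl DpiOverride {
    /// Use the forced factor (e.g. the gsettings value) if one was set,
    /// otherwise fall back to whatever winit reported for the monitor.
    fn effective_factor(&self, winit_factor: f64) -> f64 {
        self.forced_factor.unwrap_or(winit_factor)
    }
}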

@icefoxen
Contributor Author

Ok, sorry for being grumpy. After some thought and some time doing things other than fighting frustrating bugs, I agree that a set_highdpi_factor is really what is needed. The problem is basically that sometimes you are fine with logical pixel sizes (doing GUI layout, rendering easily-scaled bitmaps) and sometimes you want real pixels (rasterizing text, maybe setting up transforms for 3D graphics). Mixing the two is hard, so most users really just want to do one or the other, and hidpi is newer and FAR more complicated, as the above post demonstrates. So users, especially new ones who are trying to explore, often default to "just use real pixels until I get things working right".

So, advantages to having winit have a set_highdpi_factor():

  • Useful for testing how things will look and behave on different monitors
  • Useful for working around systems that disagree about how it should be set, as fschutt does
  • If we don't provide it, everyone in the world is going to write their own custom, slightly-incompatible version of it
  • Since afaict it looks like backends are getting rewritten for event loop 2.0 anyway a la stdweb support for eventloop 2.0 #797, if we do it now and have all backends use it consistently from the start, it will be far easier than trying to retrofit it in later (which I tried to do ages ago, and it rapidly got way too complicated).

Thoughts?

@elinorbgr
Contributor

elinorbgr commented Feb 27, 2019 via email

@chrisduerr
Contributor

Just to chime in from personal experience, the only issue I can see with dpi scaling would be that the window is automatically resized. A lot of Alacritty's code uses physical dimensions internally, and in general that doesn't create any issues since values can just be converted.

Maybe if you really don't want to deal with resizing due to DPI, being able to just disable that would be enough?

@tangmi
Contributor

tangmi commented Mar 17, 2019

Something that helped me understand DPI was to put "logical-" in front of anything (i.e. all application logic) that wasn't directly dealing with GPU textures (and use "physical-" for actual screen pixels).

For example, in a 2D game, I might have a viewport that I want to be 800 pixels wide. I read that as 800 logical pixels wide, which needs a conversion if I want to figure out exactly how big that backing texture needs to be (if the hidpi_factor is 2, then I'd want a 1600 physical pixel wide texture). A normal ortho matrix set up for some logical size should work, regardless of the render target's physical size (just don't do weird things with SV_POSITION in the fragment shader...).
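
For reference, a minimal sketch of that bookkeeping; the function name is made up, and hidpi_factor is assumed to come from the winit window:

/// Physical (texture) size = logical size * hidpi factor, rounded to whole pixels.
fn backing_texture_size(logical_w: f64, logical_h: f64, hidpi_factor: f64) -> (u32, u32) {
    (
        (logical_w * hidpi_factor).round() as u32,
        (logical_h * hidpi_factor).round() as u32,
    )
}

// e.g. backing_texture_size(800.0, 600.0, 2.0) == (1600, 1200); the ortho matrix
// is still built for the 800x600 logical viewport.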

@Osspial
Contributor

Osspial commented Mar 28, 2019

I don't think I want to add a set_hidpi_factor function, since I can easily see that being abused to override DPI in a manner useful for the developer but that breaks functionality for downstream users. However, I'd support making the WINIT_HIDPI_FACTOR environment variable work on all platforms, since we take a strong stance on forcing our users to handle DPI scaling but don't actually provide any tools to help debug DPI support when it's broken. Hell, I'd even be fine with renaming it to WINIT_DEBUG_HIDPI_FACTOR and only enabling it when compiled with debug_assertions to make it aggressively apparent that it should never be used to override DPI for actual users, but we need to have some sort of support for developers so they can be confident their HiDPI code actually works.
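
As a rough sketch of what such a debug-only override could look like (WINIT_DEBUG_HIDPI_FACTOR is the hypothetical variable proposed above, not an existing winit one):

/// Returns a DPI-factor override in debug builds only; release builds ignore it.
fn debug_hidpi_override() -> Option<f64> {
    if cfg!(debug_assertions) {
        std::env::var("WINIT_DEBUG_HIDPI_FACTOR")
            .ok()
            .and_then(|value| value.parse::<f64>().ok())
    } else {
        None
    }
}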

Regarding @fschutt's use of WINIT_HIDPI_FACTOR and shelling out to gsettings - that isn't a behavior we want to support in downstream applications, since it's hacky and really should be handled by us. According to the issue they linked, the problem is that Winit calculates the HiDPI factor based on the monitor's physical dimensions instead of asking the OS what the DPI factor should be. From my limited investigations it seems we can fix that behavior by using X resources or looking at desktop environment settings, which are solutions we should investigate and potentially bring onto master so we can make the DPI factor work as expected.

@fschutt

fschutt commented Mar 28, 2019

From my limited investigations it seems we can fix that behavior by using X resources or looking at desktop environment settings, which are solutions we should investigate and potentially bring onto master so we can make the DPI factor work as expected.

Okay, a few problems:

  • This method can't handle per-monitor DPI, because a global environment variable is, well, global for all monitors
  • The reason why I want a set_hidpi_factor is that after all these DPI problems I simply don't trust you to not mess it up again. Right now I have to calculate an "inverse" scale of the window and then set the window size to that, which is way, way hackier than a set_hidpi_factor function.
  • Both xft.dpi and xrandr also don't support per-window DPI.
  • Even that Arch Wiki page that you linked says to use gsettings. I did contact people from the Gnome mailing list to ask them where the DPI setting from gsettings is actually stored - the result of that conversation was that you basically have to either use GNOME libraries and the GNOME dbus protocol to get notified of DPI changes, or shell out to gsettings - guess which one is easier.
  • The reason I used gsettings was that I wanted the user to be able to change the DPI factor at runtime via the system settings GUI. This DPI factor (used, as far as I'm aware, by all Linux applications built on Qt or GTK) is only stored in gsettings, so if you want to handle DPI (and, in future GNOME versions, per-monitor DPI), you have to use gsettings in some way; XResources and Xft should only be a fallback.

I know that shelling out to gsettings isn't a "clean" solution, but it happens to work very well and also gives users control to change the DPI at runtime via the system settings. XResources also doesn't have a per-monitor DPI value, just a global DPI value.

Getting the DPI from the XResources was tried already, see: #543 and #169 (comment)

@Osspial
Contributor

Osspial commented Mar 28, 2019

@fschutt Ah, good to know that the obvious solution isn't, in fact, the best solution!

I brought up XResources because it does indeed seem more clean than calling gsettings, but I'd be totally fine with shelling out to gsettings on systems where it's available and falling back to XResources when it isn't. Even doing that would leave X11 DPI support in a much better state than it's in today.

The reason why I want a set_hidpi_factor is that after all these DPI problems I simply don't trust you to not mess it up again. Right now I have to calculate an "inverse" scale of the window and then set the window size to that, which is much, much hackier than a set_hidpi_factor function.

We'd like to provide a windowing API that gives functional DPI support for all of our users. Our current solution for X11 hiDPI factor admittedly doesn't work particularly well, but you've come up with a solution that's demonstrably better than ours; it would be foolish for us to not pull it upstream and allow all our users to use it, and even more foolish to somehow revert that change as you seem to imply we'd do.

Also, what context are you bringing up "inverse window-scaling" in? Without knowing that it's hard to know how a set_hidpi_factor function would solve your problem, though I'd be open to the idea of a DPI factor function should it prove necessary.

@fschutt

fschutt commented Mar 28, 2019

@Osspial The "inverse" scaling factor is (gsettings_factor / winit_factor), and then I correct the window size multiplied by that. So if I have the winit scaling factor and the gsettings factor, let's say that winit HiDPI factor is 1.3 and gsettings one is 1.0 - now in order to set the window to the correct size I have to multiply their desired width / height by 0.769 (the "inverse" scaling factor = 1.0 / 1.3) - so if I want a 800x600 window, I set the size to 615x461 (800x600 * 0.769), so that when winit ineviatbly applies a 1.3 factor on top of that, you end up with an actual 800x600 window again.

So whether you introduce a set_hidpi_factor or not, applying a custom HiDPI factor will be possible no matter what, just potentially with a lot of workarounds. And yes, I would like it to be solved upstream, but I'd also give downstream users the option to change it, just in case it goes wrong again. There are also lots of discussions on this topic in this repo; DPI handling on Linux is not an easy topic because there is so much technical debt.

@chrisduerr
Contributor

I'd like to note that even on systems where gsettings is available, a lot of users will still prefer to use Xft.dpi to set the DPI. Especially when we're talking about environments outside of GNOME.

I personally have never heard anyone complain in Alacritty that it's not possible to change its DPI scaling through gsettings, however there have been a couple of users complaining that Xft.dpi has no effect.

@icefoxen
Contributor Author

icefoxen commented Mar 29, 2019

I don't think I want to add a set_hidpi_factor function, since I can easily see that being abused to override DPI in a manner useful for the developer but that breaks functionality for downstream users.

fschutt's "I don't trust winit not to break it again" is a bit of a hot take on the matter, but I agree with it from a different angle: I don't trust OS's/windowing libraries to get it right in all circumstances, and as a developer I want a blunt tool to be able to force it to do what I want. If you have a WINIT_HIDPI_FACTOR env var, you're going to need to implement a set_hidpi_factor function internally anyway. Let us use it.

This is exactly the main problem I have: Please trust your developers to want to do the right thing. Let developers make mistakes instead of trying to wrap them in padding. Document what the mistakes look like and how not to make them. This is how you get a library that can work well in all circumstances: give your developers the power to investigate and handle odd situations instead of trying to paper over all the details, because new details will always pop up. Stop treating your developers as stupid; this is a complicated problem with lots of edge-cases on different platforms, so give us the tools to investigate and fix them.

I don't want an environment variable. Libraries should not care about environment variables; if a program wants an option configured, it should configure it. Doing otherwise just introduces the possibility for hidden behavior, with the unexpected bugs and odd interactions that involves. I want to be able to set the hidpi factor scaling at compile time, read it, write it, play with it, experiment with it, find situations where it's broken, submit bug reports to the underlying display library, and work around them when they don't get fixed. I don't want Baby's First Window Library, I want something professional-grade. We can always write a simpler abstraction atop that if people want it.

@chrisduerr
Contributor

I don't trust OS's/windowing libraries to get it right in all circumstances

I think this is actually a good point. On X11, it's actually not super uncommon that it's completely impossible to calculate the per-monitor DPI since Randr will report incorrect physical dimensions. Of course this can be fixed by the user, but I don't think I've ever seen someone using X11 that actually has those corrected.

I don't see much reason to allow people to ignore the Xft.dpi though.

So at least on X11, making the Randr scaling optional and only forcing the Xft.dpi scaling would make some sense. I believe that's similar to what Qt does by default too?

Please trust your developers to want to do the right thing.

On X11 that definitely isn't something that I would expect. Quite the opposite, I think most people would rather just ignore DPI and disable it.

Generally I don't think there's much of an issue outside of X11, so this shouldn't be too big of a problem really. The only controversial thing on X11 is automatically applying the Randr scaling, which is the best way to get per-monitor DPI on a big chunk of setups, but will result in severe problems on some of them.

If an application desires a custom scaling factor after only the Xft.dpi value is applied (with optional Randr scaling), it can still do that internally. But ignoring the Xft.dpi scaling entirely seems a bit odd to me, since there are no obvious bugs or issues that would make this source report wrong values.

@Osspial
Contributor

Osspial commented Mar 29, 2019

I don't trust OS's/windowing libraries to get it right in all circumstances... I want to be able to set the hidpi factor scaling at compile time, read it, write it, play with it, experiment with it, find situations where it's broken, submit bug reports to the underlying display library, and work around them when they don't get fixed.

I agree with the sentiment of this, but I strongly disagree with exposing an API that lets downstream users fix our bugs without submitting patches back to us (which set_hidpi_factor would do). That leads to situations where downstream users investigate our bugs, fix our bugs in their code, and then forget to submit PRs upstreaming their fixes back to us. It's easy to do (I've sure as hell done it in various contexts) and indeed, that's already happened with @fschutt's cool gsettings fix - I expect that any PR implementing that change in our code would've readily been merged, but we never received one. Yes, it was a pain in the ass for them to implement downstream. I'd argue that it should be. People shouldn't be fixing our bugs in their code - instead, if there's a bug in how we interact with the underlying OS, or even a bug in the underlying OS, downstream users are encouraged (although perhaps not as strongly as we'd like) to investigate our code, make fixes in our code, and submit said fixes back to us. I don't want to make design decisions that make it easier for them to not perform that last step.

Also, I trust developers to want to do the right thing. As you said, though, DPI is a complicated issue. If we - as the maintainers of the library intended to abstract over the subtle differences in how different platforms handle windowing - can't get it right on our first go, how can we reasonably expect a casual user to know how to get it right?


The purpose of exposing a DPI override as an environment variable is explicitly to make it hard for developers to use in their applications. Ideally it'd only be used for debugging DPI support and not actually for overriding what the end user sees, which is why I brought up making it debug_assertions-only - doing that would prevent it from possibly having unexpected effects on a user's environment.


Stepping back a moment, let's talk about your specific issues with ggez. The issue you're running into seems to be that exposing just the logical size values messes up ggez users' assumptions about a single unit lining up with a single pixel, creating ugly spatial aliasing patterns. Is that correct (please correct me if it isn't)? If so, does it make sense to expose physical sizes, and not logical sizes, to your users instead? Exposing that, along with the DPI factor as a scale factor hint, may be the correct choice for ggez, as it still lets people handle DPI correctly but do it in the way that works best for their application.

@mitchmindtree
Contributor

For those running into issues with the default behaviour on x11 specifically, the PR at #824 fixes the issue I've been running into where the "Xft.dpi" XResource was not being respected.

On that PR branch I no longer have to specify WINIT_HIDPI_FACTOR=1 to correct the scaling behaviour.

The PR is a rebase of #606, which was ready to go but had been waiting on a rebase for quite a while. I suspect it addresses a subset of the problems raised in this issue; it would be appreciated if someone else running x11 on a hidpi display could take a look and see if it also corrects behaviour on their setup.

@elinorbgr
Contributor

elinorbgr commented Mar 30, 2019 via email

@anderejd

anderejd commented Mar 31, 2019

On the other hand, it seems pretty obvious that none of us (maintainers of winit) have the temporal or mental bandwidth to handle that...

From my perspective, winit is a fundamental library and it would make perfect sense to have it in https://github.com/rust-lang-nursery

Winit needs sponsoring and could greatly benefit from commercial backing. Btw, I love all the work all maintainers and contributors put in.

@icefoxen
Contributor Author

icefoxen commented Apr 5, 2019

@Osspial That's more or less my issue, yes. The further issue is that a) the default of winit is to do the exact opposite of what I want, b) nobody seems interested in making my life easier, relegating the Obvious Solution to the realm of debug hacks even though it would be fairly easy to implement and c) I have never, ever gotten a bug report from someone complaining that their pixels aren't displayed at 1.08x the size that they ask for.

DPI is a complicated issue, yes. But it's been a complicated issue for the last 40 years, and nothing about the latest generation of display hardware has made anything any different. Part of the reason it's complicated is there is no one-size-fits-all solution, the best approach to take depends on what your goals are. As far as I can tell, if you are making a GUI application, you ask the GUI framework to handle it for you, and it scales widgets and spacing appropriately with some heuristics to try to keep things looking nice at various sizes, because that's the job of a GUI framework. If you are rendering text, you tell your text renderer and layout engine how big you want the text to appear and what your screen resolution and DPI is like and it does it for you, because text is a VERY complicated special case that has had literally thousands of years of work put into making it look nice. And if you are rendering bitmap graphics where you care about the details of the bitmap, such as a game or image editor, you either scale your graphics by exact integer multiples, you alter your rendering to take into account the physical size of the window (such as altering your OpenGL transform), or you let the user resize the window however looks good to them and live with some edge cases.

None of these benefit from having to do some extra math to find out the actual pixel size of a window or location of a mouse event. The only time an implicit hidpi conversion factor is a benefit is when you are Apple: You control the GUI framework, you control the hardware it's displaying on, and you make a hell of a lot of money from other people making programs using your GUI framework and want to make life a bit easier for them when upgrading the hardware so they keep using your products. They can control the hardware so they only ever have to scale pixels by integer multiples (because that's easy), they can control the GUI framework so it knows exactly the nuances it needs to display well at various scales, and they get to say "of course your app will work well on a Retina display, no changes needed".

In the end, all I want is simple: Have the computer do what I tell it to do, instead of what you think I should want because you're scared of me telling it to do something you don't like. If winit is a fundamental, low-level library, how can you do otherwise?

@anderejd

anderejd commented Apr 5, 2019

@icefoxen Your last reply is larger than I have time to properly respond to right now, but I will give it a try.

Today we as developers have to deal with users having everything between multiple 4K+ displays and a single ...1024x768(?) display on their machines. As a developer, saying "I want a window of 800x600 physical pixels" doesn't quite cut it anymore. If GGEZ is aiming to provide simple retro graphics, then GGEZ needs to help the developer with the scaling, cropping etc. of GL contexts, windows, textures and whatever else needs to be involved to get pixel-perfect results. Applications and games are moving towards resolution-independent rendering, making retro 2D graphics a special case. It's obviously a special case that should be possible to implement on top of winit, but it seems like GGEZ needs to deal with this and provide its users with an API that hides most of the implementation details.

In my opinion, winit should optimize the API for modern applications, not retro 2D graphics.

That's all I have time to write now. I hope this reply doesn't come across as a derogatory comment on retro games or GGEZ; I'm a long-time gamer myself and play many kinds of games, including 2D retro style.

@tangmi
Contributor

tangmi commented Apr 5, 2019

@icefoxen I think what @Osspial suggested would let you and your users effectively ignore DPI altogether--simply have ggez expose only physical units (i.e. pixels) in its public API and whenever winit gives you a LogicalSize/LogicalPosition, immediately call to_physical on it to convert it to the physical units.

That being said, doing this would mean your users would have to handle DPI themselves if they don't want their game to appear too small on a high resolution display (like a 24" 4K monitor). The alternative would be to expose only logical units (i.e. if a user asks for an 800x600 window, tell them they have an 800x600 window, even though it's actually 1600x1200 pixels). I think this can be done pretty straightforwardly in OpenGL by converting logical sizes to pixels when allocating textures and sneaking a scale factor of the hidpi_factor on the x and y axes between the orthographic matrix and clip space. Treat it like multisampling (or supersampling, I guess).

Edit: If one wants to support "pixel-perfect" retro graphics, you can always render to a texture sized to the logical size and do a filtered copy of that texture to the (physically-sized) backbuffer without much fuss (this would also give ggez an opportunity to apply topical post-processing effects like CRT simulation).
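
A minimal sketch of the "convert to physical units immediately" approach described above, assuming the winit 0.19-era dpi types (LogicalSize/PhysicalSize with to_physical); the helper name is made up:

use winit::dpi::LogicalSize;

/// Convert a logical size from winit into whole physical pixels, e.g. for texture allocation.
fn physical_pixels(size: LogicalSize, hidpi_factor: f64) -> (u32, u32) {
    let physical = size.to_physical(hidpi_factor);
    (physical.width.round() as u32, physical.height.round() as u32)
}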

@icefoxen
Contributor Author

icefoxen commented Apr 5, 2019

Allow me to clarify some things:

  • Hidpi is a fine solution for some use cases but is not the universal best solution.
  • It is entirely possible for me to convert everything via to_physical(), and that's what I'm in the process of doing.
  • It feels really, really dumb to have winit go through a lot of work and handle a lot of wacky edge-cases entirely for me to try to undo it all. Despite the really nice LogicalSize/PhysicalSize API, there are many odd conditional cases to try to handle.
  • This is not an uncommon case; people generally expect to get what they ask for. Violating literally everyone's expectations all the time is not usually a good default behavior. A user generally can tell the difference between an 800x600 window and a 1600x1200 window, notice when they're being lied to, and get confused. The even worse case comes in when someone asks for an 800x600 window, and gets an 866.66666x650 window, and everything looks subtly wrong but they aren't sure why.
  • Expecting monitors to give sensible, easy to handle values is a bad assumption. The only scaling factors that are easy to handle well are integers. I've seen scaling factors of 1.25, 1.33, 1.5, 1.66, 1.83, and also 1.083. I have a laptop with a 15" 1080p screen that reports a scaling factor of 1.5 for some reason.
  • I am not the only person who needs this functionality.
  • This is not a 2D-specific problem. If you are going to scale 3D graphics properly, you need to know how many pixels you actually need. This is not a concern just for 2D pixel games, it's a concern for anything that creates a framebuffer or renders text or video.
  • The added complexity can make diagnosing bugs much trickier.

Despite what people keep claiming, this is not a new problem. This is a problem that gamers and gamedevs have been dealing with for 25 years. Attempting to solve it by lying to the developer and user by default makes actually solving it harder. If you are trying to make winit a low-level library, as it says in the readme, you should make it a low-level library. This means allowing your users to make the decisions on what they want, instead of trying to tell them what they want.

In conclusion,

[screenshot omitted]

@icefoxen icefoxen closed this as completed Apr 5, 2019
@fkaa
Contributor

fkaa commented Apr 5, 2019

@icefoxen would the window size problem be alleviated by adding explicit physical variants for Window/WindowBuilder functions (eg. set_physical_position, set_inner_physical_size)?

@anderejd

anderejd commented Apr 5, 2019

@icefoxen The screenshots you posted indicate that you misinterpreted what people (me?) suggested here. No one is saying that we should start presenting settings in logical pixels to end-users / gamers.

Violating literally everyone's expectations all the time is not usually a good default behavior.

Leverage the type system and make it super obvious when conversions are needed. If you accept simple numeric types as input, then making them physical pixels is probably a good direction for ggez?
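
One possible shape of that suggestion, as a hedged sketch (the newtypes below are hypothetical, not part of ggez or winit):

/// Newtype wrappers so logical and physical pixels can't be mixed silently.
#[derive(Copy, Clone, Debug)]
struct LogicalPx(f64);
#[derive(Copy, Clone, Debug)]
struct PhysicalPx(f64);

impl LogicalPx {
    /// The only place a conversion can happen, making the hidpi factor explicit.
    fn to_physical(self, hidpi_factor: f64) -> PhysicalPx {
        PhysicalPx(self.0 * hidpi_factor)
    }
}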

@Osspial
Contributor

Osspial commented Apr 5, 2019

  • Hidpi is a fine solution for some use cases but is not the universal best solution.

  • It is entirely possible for me to convert everything via to_physical(), and that's what I'm in the process of doing.

  • It feels really, really dumb to have winit go through a lot of work and handle a lot of wacky edge-cases entirely for me to try to undo it all. Despite the really nice LogicalSize/PhysicalSize API, there are many odd conditional cases to try to handle.

That's... fair enough, actually. Winit's currently extremely opinionated that everything should be done in terms of LogicalSize, but since that isn't always the case we really shouldn't be making that assumption. I don't see any reason Winit can't handle the physical/logical size conversions behind the scenes, and then present whichever the user asks for when they perform the relevant API calls.

You seem pretty frustrated that you've brought up all these concerns without us doing much of anything, which is entirely reasonable! Would an API along these lines go some ways to alleviate that concern?

pub trait Position {
    fn to_physical(self, hidpi_factor: f32) -> PhysicalPosition;
    fn to_logical(self, hidpi_factor: f32) -> LogicalPosition;
}

impl Position for PhysicalPosition {/*impl*/}
impl Position for LogicalPosition {/*impl*/}

impl Window {
    /// User calls `window.get_position::<LogicalPosition>()` or
    /// `window.get_position::<PhysicalPosition>()`, or uses type
    /// inference to determine the return type.
    pub fn get_position<P: Position>(&self) -> P {..}

    /// User can provide either `LogicalPosition` or `PhysicalPosition`
    /// and Winit will do the conversion internally.
    pub fn set_position<P: Position>(&self, position: P) {..}
}

(I'm aware that specific API wouldn't work for Event types but there are other ways we can expose both physical and logical values in that context)

The even worse case comes in when someone asks for an 800x600 window, and gets an 866.66666x650 window, and everything looks subtly wrong but they aren't sure why.

Again, fair enough. I'm honestly not entirely sure why we expose fractional pixel values since there's really no sensible way for a user to handle them; I'd like to look into how OSes handle fractional logical pixels to see if there's a way we could just expose them as integers, but that requires some investigations across all the platforms.

  • Expecting monitors to give sensible, easy to handle values is a bad assumption. The only scaling factors that are easy to handle well are integers. I've seen scaling factors of 1.25, 1.33, 1.5, 1.66, 1.83, and also 1.083. I have a laptop with a 15" 1080p screen that reports a scaling factor of 1.5 for some reason.

Part of that is because, on X11 at least, the DPI scaling factor is manually calculated by Winit based on the monitor's reported size, rather than relying on what the OS says. There was a PR in the works a while ago that @mitchmindtree recently rebased which fixes that behavior (see #824), but I'd like to have someone else test that out before merging it. However, if everything goes well that fix should be in the next patch release!

@PSteinhaus

PSteinhaus commented Jun 4, 2021

@Osspial hi there, new ggez co-maintainer here.

Thanks for trying to provide help.
But even though we went all the way and changed our system to only use physical sizes, we still run into hidpi issues that we simply cannot fix by applying to_physical conversions anymore.

EDIT: Thinking more about the ScaleFactorChanged event, I wonder why we even have our issue... Looking at how this event is handled in winit on macOS, it looks to me as if winit is already doing its best to respect our choice to not resize the window, but the OS just goes ahead and does it anyway, in between, just before winit applies our chosen size...

I guess I'll open up a new issue for this.
Might be related to #791.

@chamons

chamons commented Sep 5, 2022

Not to dig up an old closed issue, but for any ggez users who come across this, ggez/ggez#1091 likely covers some of your high-DPI-specific issues.

@dhardy
Contributor

dhardy commented Sep 5, 2022

Not to stoke an old issue, but to provide some further perspective on the API: writing a GUI toolkit, I also have zero use for winit's "logical pixels". Rendering (esp. of text) needs to happen using physical pixels. Placement could use logical pixels, but would have to use floating-point formats and would be more prone to rounding issues; in contrast, physical pixels using integers work perfectly. Just about the only use of "logical pixels" is to specify the intended size of features, and the code for this has no interaction with winit (feature sizes are converted to physical pixels before being positioned, to ensure pixel-alignment).

There are a couple of exceptions:

  • Calculating size requirements for a window requires a scale factor, which isn't really available before the window is constructed (though it is possible to check the factor of available displays)
  • Wayland only supports specifying a window's size in logical pixels. My toolkit can (and does) calculate this assuming a scale factor of 1, construct the window, then re-calculate the layout using the actual scale factor and physical-pixel size. This isn't ideal (the result may be a few pixels off) but it mostly works.
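
A rough sketch of that two-pass sizing, under the assumption of a toolkit-provided layout_size closure (nothing here is a winit API):

/// First pass: compute the size to request from Wayland assuming a scale factor of 1.
/// After the window exists, the caller re-runs `layout_size(actual_factor)` against the
/// physical-pixel size reported for the surface.
fn wayland_initial_request(layout_size: impl Fn(f64) -> (f64, f64)) -> (f64, f64) {
    layout_size(1.0)
}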
