Highdpi is awful. #796
@icefoxen can you please let us know the exact issues you are getting? Is the incorrect window size the only issue? Which platforms are you experiencing issues with? What specifically happens on each one that you don't like? Also, you've probably already read this, but the winit::dpi module-level docs are an essential read for understanding how DPI in winit works and why it works the way it does. There are also some known issues on X11 which I think are fixed by #606, but that PR is waiting on a rebase, I believe. For the record, I'm very happy with my "hidpi" experience in winit since francesca's recent-ish overhaul (other than the issues addressed by #606), but I haven't used Windows since then.
Yeah, give me a few days to compile a list of bugs.
Also fschutt/azul#61 - I "solved" it by shelling out to gsettings:

```rust
/// Return the DPI on X11 systems
#[cfg(target_os = "linux")]
fn linux_get_hidpi_factor(monitor: &MonitorId, events_loop: &EventsLoop) -> f64 {
    use std::env;
    use std::process::Command;
    use glium::glutin::os::unix::EventsLoopExt;

    let winit_dpi = monitor.get_hidpi_factor();
    let winit_hidpi_factor = env::var("WINIT_HIDPI_FACTOR").ok()
        .and_then(|hidpi_factor| hidpi_factor.parse::<f64>().ok());
    let qt_font_dpi = env::var("QT_FONT_DPI").ok()
        .and_then(|font_dpi| font_dpi.parse::<f64>().ok());

    // Execute "gsettings get org.gnome.desktop.interface text-scaling-factor"
    // and parse the output.
    let gsettings_dpi_factor =
        Command::new("gsettings")
            .arg("get")
            .arg("org.gnome.desktop.interface")
            .arg("text-scaling-factor")
            .output().ok()
            .map(|output| output.stdout)
            .and_then(|stdout_bytes| String::from_utf8(stdout_bytes).ok())
            .map(|stdout_string| stdout_string.lines().collect::<String>())
            .and_then(|gsettings_output| gsettings_output.parse::<f64>().ok());

    // The first override that parses wins; otherwise fall back to the
    // factor winit calculated.
    let options = [winit_hidpi_factor, qt_font_dpi, gsettings_dpi_factor];
    options.iter().filter_map(|x| *x).next().unwrap_or(winit_dpi)
}
```

For example, my display has a (calculated) DPI factor of 1.25, but a gsettings factor of 1.0 - it would be nice if winit had a way to override the reported hidpi factor.
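A hypothetical call site for the helper above, assuming a glium/glutin EventsLoop of the same era (the surrounding setup is an assumption, not taken from azul):

```rust
// Hypothetical call site (Linux-only build), querying the primary monitor.
let events_loop = glium::glutin::EventsLoop::new();
let monitor = events_loop.get_primary_monitor();
let hidpi_factor = linux_get_hidpi_factor(&monitor, &events_loop);
println!("effective hidpi factor: {}", hidpi_factor);
```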
Ok, sorry for being grumpy. After some thought, and some time spent doing things other than fighting frustrating bugs, I agree that a way to override the hidpi factor would be useful, and I can see several advantages to having winit provide one.
Thoughts?
I'm going to ask for details, because I think I don't have a good enough understanding of what you want to achieve.
The current HiDPI model of winit is as follows:
- everything (coordinates, sizes) is given as logical coordinates, with the types reflecting it (LogicalPosition / LogicalSize)
- it is advised that all processing apart from drawing is done using logical coordinates, to be dpi-agnostic
- when physical coordinates are needed, the to_physical methods allow translation, given the hidpi factor that winit provides in the appropriate event
I believe the typedness of locations and sizes should prevent most misinterpretation between logical and physical coordinates.
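For illustration, a minimal sketch of that flow against the 0.19-era API (the helper name and the idea of resizing a backbuffer on a DPI change are assumptions for the example, not part of winit):

```rust
use winit::{Event, WindowEvent};
use winit::dpi::{LogicalSize, PhysicalSize};

/// Returns the new backbuffer size in physical pixels when the hidpi factor changes;
/// everything else in the application keeps working in logical units.
fn backbuffer_size_on_dpi_change(event: &Event, logical: LogicalSize) -> Option<PhysicalSize> {
    if let Event::WindowEvent { event: WindowEvent::HiDpiFactorChanged(factor), .. } = event {
        Some(logical.to_physical(*factor))
    } else {
        None
    }
}
```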
Now, if I understand correctly, you propose to add a method that is basically the equivalent of the WINIT_HIDPI_FACTOR environment variable used in X11 contexts. To be honest, I don't understand how this would help with anything, given the above...
So, could you elaborate on what part of this HiDPI model is problematic?
Just to chime in from personal experience, the only issue I can see with DPI scaling would be that the window is automatically resized. A lot of Alacritty's code uses physical dimensions internally, and in general that doesn't create any issues, since values can just be converted. Maybe if you really don't want to deal with resizing due to DPI, being able to disable just that would be enough?
Something that helped me understand DPI was to put "logical-" in front of anything (i.e. all application logic) that wasn't directly dealing with GPU textures (and use "physical-" for actual screen pixels). For example, in a 2D game I might have a viewport that I want to be 800 pixels wide. I read that as 800 logical pixels wide, which needs a conversion if I want to figure out exactly how big the backing texture needs to be (if the hidpi factor isn't 1.0).
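As a concrete instance of that conversion (the numbers are illustrative):

```rust
// Illustrative only: sizing the backing texture for an 800-logical-pixel-wide viewport.
let hidpi_factor = 1.25_f64;
let logical_width = 800.0_f64;
let texture_width = (logical_width * hidpi_factor).round() as u32; // 1000 physical pixels
```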
I don't think I want to add a hidpi-factor override method. Regarding @fschutt using gsettings: wouldn't reading the DPI from XResources be a cleaner solution?
Okay, a few problems:
I know that shelling out to gsettings isn't a "clean" solution, but it happens to work very well and also gives users control to change the DPI at runtime via the system settings. XResources also doesn't have a per-monitor DPI value, just a global DPI value. Getting the DPI from XResources was tried already; see #543 and #169 (comment).
@fschutt Ah, good to know that the obvious solution isn't, in fact, the best solution! I brought up XResources because it does indeed seem more clean than calling gsettings, but I'd be totally fine with shelling out to gsettings on systems where it's available and falling back to XResources when it isn't. Even doing that would leave X11 DPI support in a much better state than it's in today.
We'd like to provide a windowing API that gives functional DPI support to all of our users. Our current solution for the X11 hidpi factor admittedly doesn't work particularly well, but you've come up with a solution that's demonstrably better than ours; it would be foolish for us not to pull it upstream and let all our users benefit from it, and even more foolish to somehow revert that change as you seem to imply we'd do. Also, what context are you bringing up "inverse window-scaling" in? Without knowing that, it's hard to know how such an override would help.
@Osspial The "inverse" scaling factor is ( So whether you introduce a |
I'd like to note that even on systems where I personally have never heard anyone complain in Alacritty that it's not possible to change its DPI scaling through
fschutt's "I don't trust winit not to break it again" is a bit of a hot take on the matter, but I agree with it from a different angle: I don't trust OS's/windowing libraries to get it right in all circumstances, and as a developer I want a blunt tool to be able to force it to do what I want. If you have a This is exactly the main problem I have: Please trust your developers to want to do the right thing. Let developers make mistakes instead of trying to wrap them in padding. Document what the mistakes look like and how not to make them. This is how you get a library that can work well in all circumstances, give your developers the power to investigate and handle odd situations instead of trying to paper-over all the details, because new details will always pop up. Stop treating your developers as stupid; this is a complicated problem with lots of edge-cases on different platforms, so give us the tools to investigate and fix them. I don't want an environment variable. Libraries should not care about environment variables; if a program wants an option configured, it should configure it. Doing otherwise just introduces the possibility for hidden behavior, with the unexpected bugs and odd interactions that involves. I want to be able to set the hidpi factor scaling at compile time, read it, write it, play with it, experiment with it, find situations where it's broken, submit bug reports to the underlying display library, and work around them when they don't get fixed. I don't want Baby's First Window Library, I want something professional-grade. We can always write a simpler abstraction atop that if people want it. |
I think this is actually a good point. On X11, it's actually not super uncommon that it's completely impossible to calculate the per-monitor DPI since Randr will report incorrect physical dimensions. Of course this can be fixed by the user, but I don't think I've ever seen someone using X11 that actually has those corrected. I don't see much reason to allow people to ignore the Xft.dpi though. So at least on X11 having the Randr scaling optional and only forcing the Xft.dpi scaling would make some sense. I believe that's similar to what Qt does by default too?
On X11 that definitely isn't something that I would expect. Quite the opposite: I think most people would rather just ignore DPI and disable it. Generally I don't think there's much of an issue outside of X11, so this shouldn't be too big of an issue really. The only controversial thing on X11 is automatically applying the Randr scaling, which is the best way to get per-monitor DPI on a big chunk of setups, but will result in severe problems on some of them. If an application desires a custom scaling factor after only the Xft.dpi value is applied (with optional Randr scaling), it can still do that internally. But ignoring the Xft.dpi scaling just seems a bit odd to me without any obvious bugs or issues that could cause this source to report wrong values.
I agree with the sentiment of this, but I strongly disagree with exposing an API that lets downstream users fix our bugs without submitting patches back to us (which is exactly what a hidpi-factor override would let them do).

Also, I trust developers to want to do the right thing. As you said, though, DPI is a complicated issue. If we - as the maintainers of the library intended to abstract over the subtle differences in how different platforms handle windowing - can't get it right on our first go, how can we reasonably expect a casual user to know how to get it right? The purpose of exposing a DPI override as an environment variable is explicitly to make it hard for developers to use in their applications. Ideally it'd only be used for debugging DPI support and not actually for overriding what the end user sees, which is why I brought up making it debug-only.

Stepping back a moment, let's talk about your specific issues with
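For what it's worth, a sketch of what that debug-only usage could look like today, relying on the existing X11-only environment variable (the debug_assertions gating here is an assumption for the example, not current winit behaviour):

```rust
fn main() {
    // Only override while debugging DPI handling; never in release builds.
    if cfg!(debug_assertions) {
        std::env::set_var("WINIT_HIDPI_FACTOR", "1.0");
    }
    // The variable must be set before the events loop / window is created,
    // and is only consulted by the X11 backend.
    let _events_loop = winit::EventsLoop::new();
}
```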
For those running into issues with the default behaviour on X11 specifically, the PR at #824 fixes the issue I've been running into where the "Xft.dpi" XResource was not being respected. On that PR branch I no longer have to specify WINIT_HIDPI_FACTOR manually. The PR is a rebase of #606, which was ready to go but has needed a rebase for quite a while. I suspect it addresses a subset of the problems discussed in this issue; it would be appreciated if someone else running X11 on a hidpi display could take a look and see if it also corrects behaviour on their setup.
On the one hand, I think the question of HiDPI handling would really need to go through some kind of RFC-like process, to put on the table all the needs of winit users, the practical constraints of the different platforms, and figure out a way to abstract/unify all that as best as possible.
On the other hand, it seems pretty obvious that none of us (maintainers of winit) have the temporal or mental bandwidth to handle that...
From my perspective, Winit
@Osspial That's more or less my issue, yes. The further issue is that a) the default of winit is to do the exact opposite of what I want, b) nobody seems interested in making my life easier, relegating the Obvious Solution to the realm of debug hacks even though it would be fairly easy to implement, and c) I have never, ever gotten a bug report from someone complaining that their pixels aren't displayed at 1.08x the size they asked for.

DPI is a complicated issue, yes. But it's been a complicated issue for the last 40 years, and nothing about the latest generation of display hardware has made anything any different. Part of the reason it's complicated is that there is no one-size-fits-all solution; the best approach depends on what your goals are. As far as I can tell: if you are making a GUI application, you ask the GUI framework to handle it for you, and it scales widgets and spacing appropriately, with some heuristics to try to keep things looking nice at various sizes, because that's the job of a GUI framework. If you are rendering text, you tell your text renderer and layout engine how big you want the text to appear and what your screen resolution and DPI are, and it does the rest, because text is a VERY complicated special case that has had literally thousands of years of work put into making it look nice. And if you are rendering bitmap graphics where you care about the details of the bitmap, such as a game or image editor, you either scale your graphics by exact integer multiples, alter your rendering to take into account the physical size of the window (such as altering your OpenGL transform), or let the user resize the window however looks good to them and live with some edge cases. None of these benefit from having to do extra math to find out the actual pixel size of a window or the location of a mouse event.

The only time an implicit hidpi conversion factor is a benefit is when you are Apple: you control the GUI framework, you control the hardware it's displaying on, and you make a hell of a lot of money from other people making programs using your GUI framework, so you want to make life a bit easier for them when upgrading the hardware so they keep using your products. They can control the hardware so they only ever have to scale pixels by integer multiples (because that's easy), they can control the GUI framework so it knows exactly the nuances it needs to display well at various scales, and they get to say "of course your app will work well on a Retina display, no changes needed".

In the end, all I want is simple: have the computer do what I tell it to do, instead of what you think I should want because you're scared of me telling it to do something you don't like. If winit is a fundamental, low-level library, how can you do otherwise?
@icefoxen Your last reply is larger than I have time to properly respond to right now, but I will give it a try. Today we as developers have to deal with users having everything from multiple 4K+ displays to a single 1024x768(?) display on their machines. As a developer, saying "I want a window of 800x600 physical pixels" doesn't quite cut it anymore. If GGEZ is aiming to provide simple retro graphics, then GGEZ needs to help the developer with the scaling, cropping, etc. of GL contexts, windows, textures and whatever else needs to be involved to get pixel-perfect results. Applications and games are moving towards resolution-independent rendering, making retro 2D graphics a special case. It's obviously a special case that should be possible to implement on top of winit, in my opinion.

That's all I have time to write now. I hope that this reply doesn't come across as a derogatory comment on retro games or GGEZ; I'm a long-time gamer myself and play many kinds of games, including 2D retro style.
@icefoxen I think what @Osspial suggested would let you and your users effectively ignore DPI altogether--simply have the hidpi factor treated as 1.0. That being said, doing this would mean your users would have to handle DPI themselves if they don't want their game to appear too small on a high-resolution display (like a 24" 4K monitor). The alternative would be to expose only logical units (i.e. if a user asks for an 800x600 window, tell them they have an 800x600 window, even though it's actually 1600x1200 pixels). I think this can be done pretty straightforwardly in OpenGL by converting logical sizes to pixels when allocating textures and sneaking a scale factor of the

Edit: If one wants to support "pixel-perfect" retro graphics, you can always render to a texture that is sized to the logical size and filter-copy that texture to the (physically-sized) backbuffer without much fuss (this would also give
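A minimal sketch of that render-to-logical-texture approach using the gl crate; the framebuffer handle and the two sizes are assumed to be set up elsewhere:

```rust
// Hypothetical helper: the scene was rendered at the logical resolution into
// `scene_fbo`; stretch-blit it onto the default, physically sized backbuffer.
unsafe fn present_logical_scene(scene_fbo: u32, logical: (i32, i32), physical: (i32, i32)) {
    gl::BindFramebuffer(gl::READ_FRAMEBUFFER, scene_fbo);
    gl::BindFramebuffer(gl::DRAW_FRAMEBUFFER, 0); // default backbuffer
    gl::BlitFramebuffer(
        0, 0, logical.0, logical.1,   // source rect, logical pixels
        0, 0, physical.0, physical.1, // destination rect, physical pixels
        gl::COLOR_BUFFER_BIT,
        gl::LINEAR,                   // gl::NEAREST keeps hard edges for retro art
    );
}
```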
Allow me to clarify some things:
Despite what people keep claiming, this is not a new problem. This is a problem that gamers and gamedevs have been dealing with for 25 years. Attempting to solve it by lying to the developer and user by default makes actually solving it harder. If you are trying to make winit a low-level library, as it says in the readme, you should make it a low-level library. This means allowing your users to make the decisions on what they want, instead of trying to tell them what they want. In conclusion,
@icefoxen would the window size problem be alleviated by adding explicit physical variants for Window/WindowBuilder functions (e.g.
@icefoxen The screenshots you posted indicate that you misinterpreted what people (me?) suggested here. No one is saying that we should start presenting settings in logical pixels to end users / gamers.
Leverage the type system and make it super obvious when conversions are needed. If you accept simple numeric types as input, then making them physical pixels is probably a good direction for ggez?
That's... fair enough, actually. Winit's currently extremely opinionated that everything should be done in terms of logical pixels. You seem pretty frustrated that you've brought up all these concerns without us doing much of anything, which is entirely reasonable! Would an API along these lines go some ways to alleviate that concern?

```rust
pub trait Position {
    fn to_physical(self, hidpi_factor: f32) -> PhysicalPosition;
    fn to_logical(self, hidpi_factor: f32) -> LogicalPosition;
}

impl Position for PhysicalPosition {/*impl*/}
impl Position for LogicalPosition {/*impl*/}

impl Window {
    /// User calls `window.get_position::<LogicalPosition>()` or
    /// `window.get_position::<PhysicalPosition>()`, or uses type
    /// inference to determine the return type.
    pub fn get_position<P: Position>(&self) -> P {..}

    /// User can provide either `LogicalPosition` or `PhysicalPosition`
    /// and Winit will do the conversion internally.
    pub fn set_position<P: Position>(&self, position: P) {..}
}
```

(I'm aware that specific API wouldn't work for
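Hypothetically, using that sketch might look like this (again, not a shipped winit API; the imports are only there to make the example self-contained):

```rust
use winit::Window;
use winit::dpi::{LogicalPosition, PhysicalPosition};

// Hypothetical usage of the generic get_position/set_position sketch above.
fn example(window: &Window) {
    // The annotation (or inference) picks the representation the caller wants.
    let physical: PhysicalPosition = window.get_position();
    let logical: LogicalPosition = window.get_position();
    // Either representation is accepted; the conversion happens inside winit.
    window.set_position(physical);
    window.set_position(logical);
}
```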
Again, fair enough. I'm honestly not entirely sure why we expose fractional pixel values, since there's really no sensible way for a user to handle them; I'd like to look into how OSes handle fractional logical pixels to see if there's a way we could just expose them as integers, but that requires some investigation across all the platforms.
Part of that is because, on X11 at least, the DPI scaling factor is manually calculated by Winit based on the monitor's reported size, rather than relying on what the OS says. There was a PR in the works a while ago that @mitchmindtree recently rebased which fixes that behavior (see #824), but I'd like to have someone else test that out before merging it. However, if everything goes well, that fix should be in the next patch release!
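For reference, a rough sketch of that kind of size-based calculation (assuming the usual 96 DPI baseline; not necessarily the exact expression winit uses):

```rust
// Derive a scale factor from the monitor's reported physical width, relative to 96 DPI.
fn dpi_factor_from_reported_size(width_px: u32, width_mm: u64) -> f64 {
    let dpi = width_px as f64 * 25.4 / width_mm as f64;
    dpi / 96.0
}
```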
@Osspial hi there, new ggez co-maintainer here. Thanks for trying to provide help. EDIT: Thinking more about the
Not to dig up an old closed issue, but for any ggez users who come across this, ggez/ggez#1091 likely covers some of your high-DPI-specific issues.
Not to stoke an old issue, but to provide some further perspective on the API: writing a GUI toolkit, I also have zero use for winit's "logical pixels". Rendering (especially of text) needs to happen using physical pixels. Placement could use logical pixels, but would have to use floating-point formats and would be more prone to rounding issues; in contrast, physical pixels using integers work perfectly. Just about the only use of "logical pixels" is to specify the intended size of features, and the code for this has no interaction with winit (feature sizes are converted to physical pixels before being positioned, to ensure pixel-alignment). There are a couple of exceptions, though.
Making this an actual issue here so I can get around to doing something about it someday. It's been a pain in my ass since forever; people using ggez constantly ask why they get a 1200x900 pixel window when they ask for an 800x600 one, and how in the name of Eris they can turn it off.
Basically, if you are trying to make something pretty, you NEED to be able to actually get accurate information on where the hell pixels are, without things trying to hide it from you. The easiest way to do this is to actually just turn off any highdpi nonsense, rather than forcing the user to try to figure out everywhere they need to put in a conversion factor. Does hidpi scaling apply to the location of mouse events, for example? I don't even know.
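For reference, a sketch of how mouse positions come through in the 0.19-era API and what converting them to physical pixels looks like (the helper itself is illustrative, not part of winit):

```rust
use winit::{Event, WindowEvent};

// CursorMoved reports a LogicalPosition, so physical pixel coordinates
// require an explicit conversion with the current hidpi factor.
fn physical_cursor(event: &Event, hidpi_factor: f64) -> Option<(f64, f64)> {
    if let Event::WindowEvent { event: WindowEvent::CursorMoved { position, .. }, .. } = event {
        let p = position.to_physical(hidpi_factor);
        Some((p.x, p.y))
    } else {
        None
    }
}
```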
To demonstrate why this can be a problem, observe this program: https://github.com/icefoxen/heckin_dpi . You can turn on MSAA or such, but you still get a jerky effect where the lines walk across the screen slightly out of sync with each other, even though the actual mathematical distance between the lines is always the same.