Virtual Display Mode Support #1244
Conversation
There are some things in this PR that shouldn't be here - I'm going to make a pass through and pull them. Most notably the cursor scale stuff ended up in here again.
I'm going to need a bit of convincing that a new hsIniHelper is needed vs using (or improving) something like the existing plIniNoSectionsConfigSource and related plContainer classes
#include "plFileSystem.h" | ||
#include "hsStream.h" | ||
|
||
#endif /* hsIniHelper_hpp */ |
This #endif is in the wrong spot - it should be at the end of the file.
I'll check this out. I had looked through and hadn't found any existing class to deal with ini files - but I missed this chunk.
@dpogue - I implemented a version with plIniNoSectionsConfigSource. There is an issue, and it's silly: Python writes these ini files with a space as the key/value separator. Python and this class cluster are not compatible and just spend all day arguing over formatting - the class cluster will do really weird things to the formatting of the Python-written ini entries (see the sketch below).
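To illustrate the mismatch, a purely hypothetical example (the real keys, and the class cluster's exact separator, may differ - I'm assuming the usual key=value form here):

```
Graphics.Width 1920     ; space-separated, the way the Python code writes it
Graphics.Width=1920     ; key=value, the way an ini-style parser expects it
```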
Also worth noting why this probably is the case: Graphics.ini is not really an ini file. It's a script of console commands that gets executed when Plasma launches. It doesn't help that everything else (the file extension, the Python code) pretends it's an ini file with the same formatting as the other ini files.
Following back up on this - @dpogue - one possible way to move ahead on plIniNoSectionsConfigSource is to allow a custom separator between the key and value. That would involve modifying plIniNoSectionsConfigSource and friends, but might be preferable to defining a new class.
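As a sketch of what I mean (the shape is hypothetical - the real plIniNoSectionsConfigSource API would need checking):

```cpp
// Hypothetical sketch only: let callers override the key/value separator.
class plIniNoSectionsConfigSource /* : public plConfigSource */
{
public:
    // Default '=' keeps existing behavior; the Python-written
    // Graphics.ini could pass ' ' instead.
    void SetSeparator(char sep) { fSeparator = sep; }

protected:
    char fSeparator = '=';
};
```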
@Hoikas - This is the PR I was talking about for resolution switching/things not falling apart when changing displays. It records the percentage of native resolution, and persists that ratio across display changes and window resizes. I wasn't completely happy with where this ended up - and it requires the extra setting. It also requires native code that can edit ini files in order to update the graphics ini - which isn't something that exists today. Maybe someone better with Python could craft a better change. One issue I found is that some display stuff needs to be running before Python or the options dialog is stood up.
@dgelessus, @dpogue, @Hoikas - Looking for feedback here. The Mac Metal branch currently doesn't handle resolution switching or the startup resolution correctly. The Mac demo at Mysterium also included this branch. I'm not attached to how this works - but I don't have any better ideas so far either. Basically, it would be nice for Plasma not to reset all graphics settings when moving displays, and it would be nice to have a way to deal with resolutions when the display's actual modes are no longer important and everything is software upscaling.
Hmm, after reading over the discussion and the code, I'm a bit confused and it feels like I'm missing some context... The core problem is that when the computer's display configuration changes, the game's current graphics mode may become unavailable, which currently causes a hard reset of the graphics settings. The intent of this change is to handle this gracefully by switching to a different resolution appropriate for the new display configuration. This seems reasonable and logical to me.

The proposed solution with the "output scale" is what I'm confused about. If I'm understanding it right, this will keep the game resolution the same relative to the display resolution. So for example, if I'm running the game at 1920x1080 on a 4K (3840x2160) display and then switch to a 2560x1440 display, the game will change its resolution to 1280x720 to remain at 50% of the display size. But is that a good default behavior?

When running in fullscreen at the maximum resolution, it usually makes sense to switch to the new maximum resolution. But in other cases, I think it's odd to automatically change the game resolution to maintain the scale relative to the monitor, rather than staying as close as possible to the previous resolution. The problem is that we only know what resolution the player currently wants - we don't know why they chose that resolution and what they'd prefer on a different display.

Because it's impossible to guess the player's intent correctly all the time, I think a simple and predictable solution would be better. IMO, we should always keep the current resolution if possible, or otherwise switch to the closest available one. A special case for staying at maximum resolution in fullscreen would make sense, because that's a very common case - but otherwise, we shouldn't try to guess too much.
Render scale is a pretty common solution used by games. I did consider closest res because it would fit better with Plasma.

Consider a situation in which a user is playing Uru on a laptop, switching back and forth between the internal and an external display. Maybe they grab the laptop and play in another room sometimes, and then when they're at their desk they play on their big display. Picking the closest resolution would allow the resolution on both displays to drift over time. It works at the first changeover from the laptop display to the external display. But when switching back, the resolution on the laptop panel closest to the external display's resolution could be different from the original laptop panel resolution. And when switching back to the external display, it could drift further (see the toy sketch below).

Algorithms that work that way typically get more complicated: they track display IDs and then try to remember the resolution used for each specific display to prevent drift. That means saving a list of displays Plasma has seen - which seems like its own mess.

Another issue is that macOS is moving away from display modes - so we also need to invent our own resolutions. I'd have to check whether the display mode list API still works on modern hardware - but you definitely don't invoke a display resolution change anymore, and most games make up their own display modes now.
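A toy sketch of that drift, with made-up mode lists (not real panel data):

```cpp
#include <cstdlib>
#include <vector>

struct Mode { int w, h; };

// Pick the available mode whose width is closest to the target's.
Mode Closest(const std::vector<Mode>& modes, Mode target)
{
    Mode best = modes[0];
    for (const Mode& m : modes)
        if (std::abs(m.w - target.w) < std::abs(best.w - target.w))
            best = m;
    return best;
}

int main()
{
    std::vector<Mode> laptop = { {1440, 900}, {1680, 1050}, {2880, 1800} };
    std::vector<Mode> desk   = { {1920, 1080}, {2560, 1440}, {3840, 2160} };

    Mode res = { 1440, 900 };   // original laptop resolution
    res = Closest(desk, res);   // -> 1920x1080 on the desk display
    res = Closest(laptop, res); // -> 1680x1050: drifted away from 1440x900
}
```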
Took another look at this, and I think the challenge with this PR as it stands is that it's all the plumbing work but none of the implementation work at the pipeline level. I think it might be better to start with the pipeline side and support for parsing the console command (to allow testing via the console or via hand-editing a config file), and then add the options ini side of things after that's reviewed. |
Oh, I forgot other issues I ran into when I tried closest res... If the displays involved have different aspect ratios, closest res can get even messier - e.g. your laptop display is widescreen but your desk display is an ultrawide. I wasn't able to find an algorithm that could convincingly deal with that scenario - and the drift gets even worse. Also, windowed mode is an issue. It would be nice to be able to grab an Uru window and resize it, or move it between two different displays without an issue. As it is today, windowed mode is locked to the display modes of the display on Windows. You can't grab the window edges - you have to go into options and resize. And if you grab the window and move it to another display with other display modes, technically the window might need to resize itself.
I can plumb it in. I'd need to look at DX classic because I'm not sure DX classic can reconfigure its frame buffer size. Metal allows you to pick your own frame buffer size at any time. DX might require an offscreen render to a custom frame buffer size - which could be its own mess.
In my experience, "render scale"/"resolution scale"/"upscaling" is offered as a separate option in addition to the actual resolution. That's mainly because the games usually render the GUI/HUD at native resolution and only scale the 3D scene, but it has the advantage that the user can specifically say "render at 50% scale, no matter what the display resolution is". The problem is that Uru/Plasma only has one resolution setting, which until now has meant "resolution" and not "scale", so it would be confusing to change that now.
Unfortunately, you'll get the same kind of drift if you choose the new resolution by "relative scale". For example, I usually play games windowed at the highest possible resolution that doesn't overlap the taskbar (one or two resolutions below the maximum). That doesn't translate nicely to a percentage scale and back.
But you have the same issues with a relative scale, right? If I have the game in a 16:9 window on an ultrawide monitor, that doesn't necessarily mean I want it downsized or horizontally squished when I switch to a regular widescreen monitor.
No, it doesn't? At least on Windows, just moving a window to a smaller screen doesn't automatically change its size. This is different on macOS, where moving a large window to a small screen will limit the window size to what fits on the small screen - but that's also temporary and reverts once you move it back to a larger screen. In any case, moving the Uru window should never cause any permanent changes to the resolution setting. I agree though that we should support resizing the Uru window by dragging the edges, etc., and not just using the resolution slider in the graphics settings.
With macOS, you might be dragging from a retina screen (like a laptop display) to a non-retina screen (like an external monitor) which needs to change the resolution of the window but not its (perceived) size.
Lemme grab a few other Mac games and see what they do. I wrote a test script that queries macOS for display modes on my M1 MacBook Pro - and it's still returning display modes. Relative scale doesn't drift because the Mac (right now) doesn't use display modes at all. We can pick any arbitrary resolution we want. So when transferring displays, we pick an arbitrary resolution that matches the same scale. Float drift is possible, but this PR has some things to compensate for that.
It's not a limitation in Windows - it's technically a limitation in Plasma. Plasma is supposed to restrict its window size to one of the available display modes for the current display. Windows is just fine with keeping the window the same size as it moves displays; Plasma is not. I don't have dual displays on my Windows box so I haven't tried it (I really should). But in theory, with Plasma's defined behaviors, if the Plasma window is moved from one display to another, and the new display does not have the same display mode, all graphics settings will be flipped back to defaults. That may not be what's happening right now - but that's how Plasma is supposed to deal with display changes.
Okay, I am actually using a retina MacBook with non-Retina external screens, but that has nothing to do with it - macOS has no trouble keeping the logical window size the same when moving to a monitor with a different pixel density. The behavior I'm describing is only when you move a window onto a screen that's smaller than the window itself. If the window is already small enough to fit, its size doesn't change.
I have dual monitors on Windows and I can move the Uru window between screens just fine, or even have it span both screens. This is why I'm so confused about what needs changing in windowed mode - as far as I can tell, everything works perfectly already.

I don't know DirectX and Plasma's graphics code well, but as far as I can tell, the supported display modes are queried for the graphics adapter as a whole, not any specific screen or window, so moving the game between screens cannot change the available display modes. I don't know how the list of modes is constructed when there are multiple monitors - I assume it returns every mode supported by any attached monitor? I can't test this easily, because all the modes of my smaller monitor are also supported by my larger monitor.

I also tried un-/plugging the larger monitor while Uru is running. That turns the game window black, even if I set the game to a resolution that fits on the small monitor (800x600). Once I plug the large monitor back in, the game renders correctly again. So clearly there's something monitor-specific going on, but I'm not sure what. (Interestingly, none of this resets the graphics settings.)
There are two bounds systems at play. There are the AppKit bounds, which remain constant. But there are also the window's backing bounds - which do change. https://developer.apple.com/documentation/appkit/nsscreen/1388389-convertrecttobacking?language=objc Backing is what Metal and OpenGL apps use. So moving between a Retina and non-Retina display will change an NSWindow's geometry: it will not change the bounds, but the backing is the important metric, not the bounds. How to keep the resolution consistent when moving a window between displays while the backing resolution changes is what this PR covers. I'm not sure this PR is the only possible solution - but it is something to address.
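A minimal sketch of the two geometries, assuming all we have is the logical size and the backing scale factor (the names are mine, not Plasma's):

```cpp
// AppKit bounds (points) stay constant across displays; the backing size
// (pixels) is bounds * backingScaleFactor and changes with pixel density.
// Metal and OpenGL render into the backing size.
struct WindowMetrics
{
    float logicalWidth;  // AppKit bounds, in points
    float logicalHeight;
    float backingScale;  // 1.0 on non-Retina, 2.0 on Retina
};

int BackingWidth(const WindowMetrics& m)  { return int(m.logicalWidth  * m.backingScale); }
int BackingHeight(const WindowMetrics& m) { return int(m.logicalHeight * m.backingScale); }

// A 1280x720-point window is a 1280x720-pixel surface on a 1x display,
// but a 2560x1440-pixel surface after moving to a 2x Retina display.
```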
This could actually be a bug in Plasma. I would say the bug is probably a good thing - but it's technically not the behavior Plasma has defined. Plasma on Windows is supposed to be locked to the current display's video modes, and if it ever departs from that, everything is reset back to defaults. This means if you plug in a new display, all your graphics settings are reset if the mode is different.

When Uru was first written, this may not have been critical. Every display on the market probably supported 800x600 or maybe even 1024x768. Widescreen displays still weren't especially common, and changing displays or having dual displays was even rarer. So this logic probably held up. Now, when I take my MacBook Pro and plug it into my 5K display, things don't hold up.

Plasma is querying the display modes of a display. Offhand, I think I've seen code suggesting it's supposed to query the display modes of the display the Plasma window is on. But they may not have accounted for the window moving.
I think there is a bug here where Plasma is supposed to execute a reset of the DirectX pipeline - which would fix the black screen - but would also reset the graphics settings. I don't think it's doing so on a display change, which is probably an artifact of my comment above: there was a whole bunch of stuff they didn't worry about because it would have been rare in the early 2000s.

Generally on this PR: the other reason I might move this out into its own issue is that there is some discussion of the Windows and Direct3D pipeline going on here. I wasn't originally going to address Windows. OpenGL and Metal both have ways of dealing with frame buffer sizes independent of display modes; D3D9 still locks resolution to available display modes when in full screen. So any discussion of fixing issues on Windows might extend beyond the scope of this PR.
I should clarify: the behavior I described above is with regular desktop applications, like the Finder or Firefox, where the user only cares about the "logical" window size (bounds?) and doesn't think about the "physical" (backing?) resolution. It's quite possible that games handle this differently. I normally don't play games on my retina MacBook, so I don't have any experience with how Mac games work in mixed DPI setups.
In the context of moving between different DPIs, your proposed solution makes a bit more sense. But if that is the main issue, wouldn't it be better to save the screen's actual scaling factor (e.g. 2x for retina, 1x for regular DPI) and adjust the resolution based on that? For example, if the game is running in a 1280x720 window on a 1920x1080 non-retina monitor and I then move it to a 5K (5120x2880) retina monitor, I would expect the resolution to change to 2560x1440, because then the window is still "logically" 1280x720 on the new monitor. In fullscreen, I would expect either the same behavior (keeping the "logical" resolution) or staying at 1280x720 (keeping the "physical" resolution). In either case, I would not expect it to switch to 3413x1920 or whatever, just because that maintains the width relative to the screen.
You might be thinking of …
I think there are a few things being conflated here. This PR doesn't work like you described. On macOS, the ratio is maintained as a product of the window's backing. On the non-Retina display, the backing size would be 1280x720; at a ratio of 1, the game would render at 1280x720. When the window is moved to the Retina display, the backing becomes 2560x1440; the game maintains the 1:1 ratio and begins rendering at 2560x1440 - like you would want. The max resolution of the display is irrelevant unless the window is full screen. The macOS client does not care about display modes - only window size. It has no idea the monitor has a resolution of 5120x2880 unless the window is made that size or brought to full screen.
Just to reinforce the above: on Windows, Plasma checks display modes. On macOS, there is no display mode checking. I think this code accounts for the window moving - but my guess is that Plasma does not subscribe to the right callback to know when the window has moved. I'm pretty sure I have seen the code that traces a window back to its parent display. I'll take a look if I get a chance.
The DirectX pipeline has a series of "reset" functions (e.g. IResetDevice) that basically nuke and pave everything any time a display configuration changes. It's a bit heavy-handed - I think the Direct3D pipeline has both hard and soft resets. If you're getting a black screen, that might imply the Direct3D pipeline is not entering a reset when it needs to. On the Mac game front, I'm continuing to grab recent ports. World of Warcraft will actually only let you pick the resolution on some hardware - which I doubt will do anything to help this conversation. I can go over why they do that once I have the other games figured out. I specifically want to figure out how the other games handle display changes.
Resize is another function in the D3D pipeline that does resetty things. I thought it was required by the abstract pipeline interface (GL and Metal also both implement it) but it comes with this curious note:
Here's what I'm finding with most games. I don't know if the answer will make anyone happy.

It seems like every game I tried will default to using the full resolution of the panel if the display is ever changed. So if I play on my laptop, the game will default to full res. If I play on my desktop display, it will default to full res. Go back to the laptop - full res again. Even if I explicitly set the resolution on one display, it will go back to full res after I cycle through displays.

Render scale is also available in all these games - and render scale is maintained no matter how many times you switch displays. Most games encourage you to use render scale instead of resolution switching because it keeps the UI sharp while changing the rendering resolution. (I think that is a feature Uru should adopt. I'd love to see the cursor and the UI always rendered at full resolution. That may be another discussion, but it would depend on the outcome of this PR.)

The only exception I saw was World of Warcraft. When on a non-rectangular display (like most of the Apple Silicon laptops) it will not offer any resolution aside from the native resolution. This may be because it draws using the full panel, including the notches at the top, which the display modes don't account for. Render scale was the only way to change the render resolution in this scenario.

So one possible easy fix: if the display modes don't match what is saved, switch to the panel resolution. But in most games that is usually coupled with a render scale - which this PR introduces. (A sketch of that fallback follows below.)
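A rough sketch of that fallback (hypothetical names, not actual engine code):

```cpp
#include <vector>

struct Mode { int width, height; };

// If the saved mode is no longer offered by the current display, snap to
// the display's native resolution. The render scale persists regardless.
Mode PickOutputMode(const std::vector<Mode>& available, Mode saved, Mode native)
{
    for (const Mode& m : available)
        if (m.width == saved.width && m.height == saved.height)
            return saved;   // saved mode still valid - keep it
    return native;          // otherwise default back to full panel res
}

// The actual render target is then scaled independently of the output mode.
Mode RenderTarget(Mode output, float renderScale)
{
    return { int(output.width * renderScale), int(output.height * renderScale) };
}
```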
Another thing worth noting, since we're in a classic Direct3D context: none of the games actually changed the display resolution. They queried the display modes, but would instead set the Metal output size to a resolution matching the chosen display mode. I haven't tried full screen resolution switching on Windows in a bit, so I don't know if Windows still switches the actual display resolution for DX9. But that's not how DX9 typically does things at an API level - DX9 really wants a valid display mode when in full screen.
So - after some more investigation - display modes are actually kind of broken on most Apple Silicon laptops. World of Warcraft is correct to disable resolution switching on current MacBook Airs and MacBook Pros. If I do a display mode query on an M1 MacBook Pro 16", I get a lot of modes, but we'll look at these two specifically just as an example:
These modes are identical except for one thing: the height. Why? The panel resolution of an M1 MacBook Pro 16" is 2234x3456, but the addressable region - for an AppKit application using the modern full screen API - is only 2160x3456. This is because a full screen AppKit application (at least using the modern API) cannot extend its content into the "ears" around the notch at the top. Basically, CoreGraphics will offer a bunch of modes that the panel can technically do but a full screen AppKit app cannot.

Blizzard seems to have basically thrown in the towel and only offers render scaling on these displays. Strangely enough, other games - like No Man's Sky - seem to stumble into this with no recognition of the problem. No Man's Sky defaults to 2234x3456 in a 2160x3456 window, leading to a slightly distorted image.

First - I'm inclined to actually go file a bug with Apple, because it feels like game developers are using their display modes API unaware of this issue. And major releases like No Man's Sky are being affected - so there really needs to be some documentation here.

Second - this is leaning me back towards going with only display scaling. I have a prototype of Uru that uses the panel's display modes, but on my M1 MacBook Pro it's full of modes that include the "ears" area of the display, which is not addressable and will just cause distortion.
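If we did stick with display modes, the prototype would presumably need a filter along these lines (a hypothetical sketch - the addressable size would have to come from AppKit):

```cpp
#include <vector>

struct Mode { int width, height; };

// Keep only the modes that fit inside the region a full screen AppKit
// window can actually address (which excludes the notch "ears"), so we
// never pick a mode that gets squeezed into a smaller window and distorts.
std::vector<Mode> FilterAddressable(const std::vector<Mode>& modes, Mode addressable)
{
    std::vector<Mode> usable;
    for (const Mode& m : modes)
        if (m.width <= addressable.width && m.height <= addressable.height)
            usable.push_back(m);
    return usable;
}
```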
This PR is a draft and I'm open to all suggestions.
This pull request aims to introduce features into Plasma that make the engine friendlier to modern macOS. These features could also be adopted by Linux and Windows - but this PR makes no attempt to implement them for those platforms.
Window server display events
The first problem this PR attempts to solve is how Plasma should respond when the display resolution changes. The client will need to respond to window server events and pass in a new resolution.
Previously, the engine did not support the client setting the resolution. It would also reset all graphics settings if the environment changed to the extent that the previous display mode was no longer available.
To solve this, Plasma now records the ratio of the selected width to the maximum width. When the environment or window size changes, a new resolution is picked that preserves that ratio against the new window size.
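A minimal sketch of that ratio bookkeeping (names hypothetical, not the PR's actual code):

```cpp
#include <cmath>

// On save, persist the fraction of the maximum width currently in use.
float SaveWidthRatio(int selectedWidth, int maxWidth)
{
    return float(selectedWidth) / float(maxWidth);
}

// On an environment or window size change, rebuild the resolution from
// the stored ratio against the new maximum size.
void ApplyRatio(float ratio, int newMaxWidth, int newMaxHeight,
                int& outWidth, int& outHeight)
{
    outWidth  = int(std::lround(newMaxWidth  * ratio));
    outHeight = int(std::lround(newMaxHeight * ratio));
}
```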
To allow the client to change resolutions, a new ini interface gives the client access to the graphics ini. This is currently the only way the Python and C++ halves of Plasma can exchange settings.
Virtual resolutions
Virtual resolutions in general have a few other benefits, and this pull request introduces a switch to turn off display mode validation. This lets the client set whatever resolution it wants.
Metal has its own render buffer that's entirely distinct from the display mode. This render buffer can be set to any size.
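For what that looks like in practice, a hedged Objective-C++ fragment (the Mac client's dialect; CAMetalLayer's drawableSize is a real QuartzCore property, but the wrapper function here is hypothetical):

```objc
#import <QuartzCore/CAMetalLayer.h>

// The Metal drawable size is decoupled from the window and from any
// display mode; the window server scales the result to fit the window.
static void SetVirtualResolution(CAMetalLayer* layer, int width, int height)
{
    layer.drawableSize = CGSizeMake((CGFloat)width, (CGFloat)height);
}
```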