Allow changing video resolution at runtime + more #538
base: port
Conversation
If you're looking for opinions from (admittedly fairly technically minded in my case) players, I think ideally the FPS-related options would work something like this:
There are some reasons why I haven't exposed the framerate cap in the options. Mostly two:
Of course, we could just expose them as-is and let the user figure it out.

As for the PR itself, you should probably avoid exposing Fast3D headers to the game code where possible. This can be achieved by providing thin wrappers for mode listing, e.g. across video.h, video.c, and gfx_sdl2.cpp.

There's no need to allocate memory for them either. Well, maybe if you want to cache them in video.c, but that shouldn't be necessary.
It seems the framerate options are more involved than I had originally thought. Thank you both for your insights. I'll have more to say about that later, but at the moment I'm pretty exhausted.

Hmm... I can't just use the raw output of

EDIT: Okay. Nevermind... using a struct is a million times simpler and far less error-prone to implement. Well, I could move

The modes array is formatted like so:

{"Custom", "3840x2160", "2560x1600", "2560x1440", "2048x1536", ...}

I could alternatively use a fixed-size array, but I don't want to end up causing someone with, like, 50 unique display modes to not have them all be available. Also, when it comes to selecting the resolution from the dropdown menu, do the strings really not need to be cached?
Okay, well, then they should be all cached and deduplicated in video.c at startup. Basically what you do in gfx_sdl2.cpp, except in video.c. For example:
Index 0 could still serve as "custom", you just fill it in with whatever the current mode is and add special handling wherever.
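A sketch of the startup caching and deduplication described above (hypothetical names; the real video.c may structure this differently): gather the backend's modes once, sort them descending, and drop duplicates so the menu can index a stable list.

```c
#include <stdlib.h>

struct VideoMode { int w, h; };

/* qsort comparator: widest first, then tallest */
static int modeCmpDesc(const void *a, const void *b) {
	const struct VideoMode *ma = a, *mb = b;
	if (ma->w != mb->w) return mb->w - ma->w;
	return mb->h - ma->h;
}

/* Sorts `modes` in place and returns the deduplicated count.
 * Meant to run once at videoInit() time, against a copy of the
 * backend's mode list. */
static int modeListBuild(struct VideoMode *modes, int count) {
	int out = 0;
	qsort(modes, count, sizeof *modes, modeCmpDesc);
	for (int i = 0; i < count; i++) {
		if (out == 0 || modes[out - 1].w != modes[i].w
		             || modes[out - 1].h != modes[i].h) {
			modes[out++] = modes[i];
		}
	}
	return out;
}
```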
Okay, I see. This solves the
You only need to convert them to strings when actually displaying them in the menu, because everywhere else in the code will just need the numbers. It just makes it easier and makes more sense to me, as you don't have to

I just want Fast3D to remain as encapsulated as possible, because it is cleaner this way and theoretically makes changing renderers less painful. It makes more sense to me when the actual display mode retrieval is on the backend (gfx_sdl2), while video.c deals with filtering/sorting/giving them to the user.
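The "convert only when displaying" point amounts to something this small (illustrative helper name), which is why caching the strings buys nothing:

```c
#include <stdio.h>
#include <string.h>

/* Format a mode as "WxH" into a caller-provided buffer, only at the
 * moment the menu draws the entry; the rest of the code uses the ints. */
static void modeToString(char *buf, size_t size, int w, int h) {
	snprintf(buf, size, "%dx%d", w, h);
}
```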
Okay, after implementing, for the most part, what you sent me, debugging and testing everything again to make sure things are still working correctly, evicting the string cache and its accomplice, I opted to keep the "Custom" message for unknown video resolutions because I feel that it works better when considering that just inserting whatever the current video mode is into index 0 can result in a duplicate entry or down-right confusion when a smaller value is above a list of descending values. "Custom", on the other hand, looks special because it is special. But, admittedly, it can be another source of confusion if someone is expecting to be able to enter a custom resolution once selected. |
Yes, I just provided the code as a janky example of how to manage the video mode list; this is all fine. I think we don't have to actually add a way to enter a custom resolution from the menu, because someone can just edit the INI for that. Most people will probably play in borderless fullscreen anyway.
Yeah, I'm not planning on incorporating that functionality.
Okay, on the topic of framerate:
so, for example, something like:

if (g_TickRateDiv < 2) {
    g_TickRateDiv = (vidFramerateLimit == 0 || vidFramerateLimit > 60) ? 0 : 1;
}

the

I imagine setting

One thing I don't like, at the moment, is that setting either

Oh, and about vsync: vsync is king. If it's enabled, then

As for how I'd like things to be laid out in the video menu: I think it's best to just keep things simple. I was thinking of making the framerate option a slider, from 0 to, I suppose, 240. I'd disable the slider if vsync is enabled, just like I currently do for the resolution dropdown menu if

So yeah. If I have a totally screwed up understanding of how this works, please let me know. 😂 I'm a bit skeptical of my own conclusion since it seems to contradict what has already been mentioned here about the framerate, particularly about
Yes, it would result in fewer bugs if everything above 60 FPS was interpolated, like some other ports do. Arguably, "true" high FPS like what the port has right now is better, since there's no interpolation delay, but it does result in more issues. For example, since the game uses a 240Hz timer with integer values, Combat Boost will start breaking at over 120 FPS, since it slows down time 2x. So either we need to fix this somehow or fall back to interpolation, which has its own issues.

The reason you can set the tickrate divisor above 1 is "why not". Maybe you want to hard lock the game to 30FPS and see what happens.

Something like what you're suggesting can be implemented, but you need to figure out how it will interact with vsync. There's no reliable way to determine your display's refresh rate (SDL's display info functions sometimes return bullshit, and sometimes you have G-Sync, etc.), so it's hard to determine what framerate the game will actually run at if vsync is enabled. If you rely on

"Sync every N frames" is a somewhat confusing way to say "it will wait for N vblanks every SwapWindow call instead of 1 vblank". Meaning that if you set it to 2 and your display runs at 60Hz, the game will lock to 30FPS. If your monitor runs at 150Hz, the game will lock to 75FPS. Both

Proper frame pacing is a very difficult problem, at least when you don't have swap chains. Thus all the options you can screw with, so you could possibly find a solution even on the most weird-ass setup. It is also probably a completely separate issue from this PR.
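The "Sync every N frames" arithmetic above can be written down directly. A sketch, assuming the reported refresh rate can be trusted at all (which, as noted, it often can't under G-Sync and friends); in SDL2 terms the interval would be handed to SDL_GL_SetSwapInterval:

```c
/* Effective framerate when SwapWindow waits for `swapInterval` vblanks.
 * Returns 0 for "no vblank cap" (vsync off). refreshHz comes from the
 * display, which SDL may report incorrectly on variable-refresh setups. */
static int effectiveVsyncFps(int refreshHz, int swapInterval) {
	if (swapInterval <= 0) {
		return 0; /* vsync disabled: only the software cap applies */
	}
	return refreshHz / swapInterval;
}
```

This reproduces the examples in the comment: 60Hz with interval 2 locks to 30 FPS, 150Hz with interval 2 locks to 75 FPS.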
Good points on monitor refresh rates being inconsistent and difficult to deal with, especially the variable-refresh ones. Not sure how to deal with that at the moment. I'll think of something eventually, I suppose.

EDIT: Actually, wait: now that I know vsync and the other caps aren't mutually exclusive, in the video menu I'll just allow the player to set vsync and also set a framerate limit.
Oh yeah, I know what it means, but I was confused about how it was actually being implemented in the code. I couldn't find where it syncs every N frames.

EDIT: Okay, it wasn't documented by SDL, but I did find the relevant info from Khronos about
For some reason, I thought they were mutually exclusive. After checking the code again, they indeed are not. There is a confusing comment about
Fullscreen-exclusive mode is now togglable at runtime as well, via a dropdown menu with "borderless" and "exclusive" options. All variables pertaining to video resolution, exclusive mode, and maximized state are now synchronized. For instance, Alt+Enter will affect the state of the fullscreen video option, and maximizing the window via the title bar will update the maximize-window option accordingly.
I was thinking about the layout of the Extended Video Options again and came up with this:
EDIT: Removed

It is organized into five categories of descending "importance/popularity":
Another thing to note: this no longer fits on a single screen and needs to be scrolled. I wanted to try to keep it all visible without the need for scrolling. Oh well. At least most of the important options will be visible without scrolling. Alternatively, I suppose it would be possible to make it so the player needs to select from one of those five aforementioned categories first, but I don't want to do that, since I feel like that would become a nuisance.

Oh yes, and I need to make sure that some of these options aren't available if the hardware doesn't support them.

EDIT: forgot to mention:
Might be showing my idiocy here, but exposing the AllowHiDPI setting in the in-game UI seems like a bad idea. I think if you know what "Allow HiDPI" means and know enough that you want it on, you're probably enough of a power user to check the documentation or even go through the .ini file to see what options you can configure, and therefore don't need it to be part of the game UI.
I put

I could alternatively hide those options away under an actual
Right, but there are people who will absolutely ignore that, whether because they overestimate themselves or simply cannot read.
Hmm... it does seem to be something that is set only when creating a window. A proper implementation at runtime would then require that I destroy and recreate the window, which is kind of a pain in the ass, so... I should probably drop it from the menu after all.
Pretty sure changing HiDPI settings requires recreating the window, because it's a window flag + hint, which we don't do. I'd leave it in the ini only.
After implementing and testing the vsync options, it's become apparent that my proposal is inadequate. Sliders can be disabled, but they don't have the appearance of a disabled menu item. Moving on. I'm currently looking at the MP Limits menu for inspiration. It seems one can add labels to specific slider values. Might be interesting...
Okay, I got

I implemented a new menu item flag:

I imagine this feature could be useful for any other slider where one would want to defer its update.
Also implements a new menu item flag, MENUITEMFLAG_SLIDER_DEFERRED, which calls the MENUOP_GETSLIDER callback only on entering dimmed mode, storing its value for later ticks, and calls the MENUOP_SET callback only on leaving dimmed mode. This was necessary to prevent the framerate limit slider from immediately affecting the framerate, which led to an annoying slowdown as the value was decremented to 0.
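The deferred-slider behavior described above boils down to a tiny state machine. A sketch with illustrative names (the real implementation lives in the menu item handler and uses the MENUOP_GETSLIDER/MENUOP_SET callbacks): read the value once on entering dimmed mode, edit a local copy while sliding, and commit once on leaving.

```c
struct DeferredSlider {
	int pending; /* local copy edited while the slider is grabbed */
	int dimmed;  /* 1 while in dimmed mode */
};

/* Entering dimmed mode: equivalent to the single MENUOP_GETSLIDER call. */
static void sliderEnter(struct DeferredSlider *s, int current) {
	s->pending = current;
	s->dimmed = 1;
}

/* Per-tick adjustment: no setter fires, so e.g. the framerate cap
 * doesn't drop while the value is being decremented toward 0. */
static void sliderAdjust(struct DeferredSlider *s, int delta) {
	if (s->dimmed) {
		s->pending += delta;
	}
}

/* Leaving dimmed mode: returns the value to hand to MENUOP_SET once. */
static int sliderCommit(struct DeferredSlider *s) {
	s->dimmed = 0;
	return s->pending;
}
```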
This seemed too easy. Was there a reason this was being deferred until level load?

diff --git a/port/src/optionsmenu.c b/port/src/optionsmenu.c
index dfc12c97b..9e23290af 100644
--- a/port/src/optionsmenu.c
+++ b/port/src/optionsmenu.c
@@ -905,6 +905,7 @@ static MenuItemHandlerResult menuhandlerTexFilter2D(s32 operation, struct menuit
return videoGetTextureFilter2D();
case MENUOP_SET:
videoSetTextureFilter2D(data->checkbox.value);
+ g_TexFilter2D = videoGetTextureFilter2D() ? G_TF_BILERP : G_TF_POINT;
break;
}
NOTE: Only tested on Windows 10. I don't have access to other OSes, but hopefully it behaves the same way on them via SDL.
New config option:
Video.CenterWindow
I must admit that a centered window at native resolution, but not in fullscreen, results in what looks very much like fullscreen, at least on Windows 10. The taskbar disappears and everything. I don't really like that behavior and it feels misleading when Alt+Enter appears to do "nothing". I wonder how it behaves on other OSes?
EDIT: So, this means that having Center Window enabled and Full Screen disabled at native resolution is essentially borderless windowed mode. On Windows 10, the desktop compositor is NOT bypassed in this configuration, but in actual fullscreen mode it is bypassed (for both borderless and exclusive modes). Anyway, this was completely unintentional on my part, but some people might like this configuration more than the other ones.

Implementation
I took the liberty of rearranging the more frequently accessed / sought-after video options to the top of the list. Since video display modes vary per device, during videoInit() I create a heap-allocated array of unique resolutions for every video mode, which are later cycled through in a dropdown menu. I made sure to free it at exit, even though that doesn't really matter too much. Since the window can be arbitrarily resized, I made sure to accommodate that with the special "Custom" resolution, which basically just means that the current window resolution doesn't match any of the known resolutions in the array.

Variables which store the state of fullscreen, fullscreen-exclusive, resolution, and maximize are now synchronized with the state of the window. They will be written to the ini as modified at runtime. I feel they are more intuitive this way. For instance, if I change the resolution in-game, I'd like the resolution to be the same after I exit and launch the game again, rather than the ini being the sole place where modification is allowed. I should note that custom resolutions aren't written to the ini; whatever the last known resolution (as in, one belonging to the array) was is what will be written in that case.
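The "Custom" matching described here is just a lookup of the current window size against the cached array, with index 0 reserved as the sentinel. A sketch (hypothetical names; the real array holds whatever videoInit() built):

```c
struct VideoMode { int w, h; };

/* Map the current window size onto the cached mode list.
 * Index 0 is the reserved "Custom" entry; any size not found in the
 * list (e.g. after an arbitrary window drag-resize) falls back to it. */
static int findModeIndex(const struct VideoMode *modes, int count,
                         int w, int h) {
	for (int i = 1; i < count; i++) {
		if (modes[i].w == w && modes[i].h == h) {
			return i;
		}
	}
	return 0; /* unknown size -> "Custom" */
}
```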
Caveats
Since it leverages the existing system in place for resizing the window, it carries the same caveats, though only one is really obvious: the blur framebuffer is blanked if the resolution changes vertically, while horizontal changes result in warping of that buffer. Cycling the pause menu on and off will take a new screenshot and restore the blur buffer. A warped blur buffer can only be fixed by blanking it with a vertical resolution change, followed by pause-menu cycling.
I was thinking of a more elegant solution to this problem, which would probably involve pushing a special menu mode for resolution changes, which, in reality, would not render any menu but instead show a clean view of the background for a screenshot to be taken, then reapply the blur buffer and pop said menu. I imagine the same solution can be used for arbitrary resizes (like dragging the window) while the pause menu is open.
Future Work
I have aspirations to make the rest of the ini-exclusive video options runtime-modifiable, in a way that is user-friendly. I'll have to think about how best to abstract the framerate-related options, since they're a bit technical as is. Having the ability to toggle the current FPS in-game would be nice, too.
As always, if you want me to change something, let me know.