Allow changing video resolution at runtime + more #538

Draft · ZenithMDC wants to merge 6 commits into base: port
Conversation

@ZenithMDC commented Nov 21, 2024

[screenshot: evo]

NOTE: Only tested on Windows 10. I don't have access to other OSes, but hopefully it behaves the same way on them via SDL.

New config option: Video.CenterWindow
I must admit that a centered window at native resolution, but not in fullscreen, results in what looks very much like fullscreen, at least on Windows 10. The taskbar disappears and everything. I don't really like that behavior and it feels misleading when Alt+Enter appears to do "nothing". I wonder how it behaves on other OSes?

EDIT: So, this means that having Center Window enabled and Full Screen disabled at native resolution is essentially borderless windowed mode. On Windows 10, the desktop compositor is NOT bypassed in this configuration, but in actual fullscreen mode, it is bypassed (for both borderless and exclusive modes). Anyway, this was completely unintentionally implemented on my part, but some people might like this configuration more than the other ones.

Implementation

I took the liberty of rearranging the more frequently accessed / sought-after video options to the top of the list. Since video display modes vary per device, during videoInit() I create a heap-allocated array of unique resolutions drawn from every video mode, which a dropdown menu later cycles through. I made sure to free it at exit, even though that doesn't matter too much. Since the window can be arbitrarily resized, I accommodate that with the special "Custom" resolution, which basically just means that the current window resolution doesn't match any of the known resolutions in the array.

Variables that store the state of fullscreen, fullscreen-exclusive, resolution, and maximize are now synchronized with the state of the window, and they are written to the ini as modified at runtime. I feel they are more intuitive this way. For instance, if I change the resolution in-game, I'd like the resolution to be the same after I exit and relaunch the game, rather than the ini being the sole place where modification is allowed. I should note that custom resolutions aren't written to the ini; in that case, whatever the last known resolution was (as in, one belonging to the array) is what gets written.

Caveats

Since it leverages the existing system in place for resizing the window, it carries the same caveats, though only one is really obvious: the blur framebuffer is blanked if the resolution changes vertically. Horizontal changes result in warping of that buffer. Cycling the pause menu on and off will take a new screenshot and restore the blur buffer. A warped blur buffer can only be fixed by blanking the blur buffer with a vertical resolution change, followed by pause menu cycling.

I was thinking of a more elegant solution to this problem: pushing a special menu mode for resolution changes which would not render any menu, but instead show a clean view of the background so a screenshot can be taken, then reapply the blur buffer and pop said menu. I imagine the same solution could be used for arbitrary resizes (like dragging the window) while the pause menu is open.
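
A rough sketch of that idea, with entirely hypothetical helper names (none of these functions exist in the tree):

/* Hypothetical flow for the special menu mode described above.
 * Every name here is a placeholder, not an actual engine API. */
static void videoOnResolutionChanged(void)
{
	menuPushCleanFrameMode(); // push a "menu" that renders no UI, just the scene
}

static void menuTickCleanFrameMode(void)
{
	// by now one clean frame has been rendered at the new resolution
	blurTakeScreenshot();     // grab a fresh screenshot for the blur buffer
	blurApplyToFramebuffer(); // rebuild the blur from it
	menuPopCleanFrameMode();  // return to the pause menu
}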

Future Work

I have aspirations to make the rest of the ini-exclusive video options runtime-modifiable, in a way that is user-friendly. I'll have to think about how best to abstract the framerate-related options, since they're a bit technical as is. Having the ability to toggle the current FPS in-game would be nice, too.

As always, if you want me to change something, let me know.

@LonelySpaceDetective commented Nov 22, 2024

If you're looking for opinions from (admittedly fairly technically minded in my case) players, I think ideally the FPS-related options would work something like this:

  • Within the UI, I think the tickrate divisor should be relabeled as the general FPS cap, even if that's not quite what the option is. Values of 1 and up could be relabeled simply as the framerate they result in (probably capping at 3, i.e. 20 FPS, or 4, i.e. 15 FPS), while 0 I would label as "Advanced"; if set, it exposes the following option.
  • The actual framerate limit option should only appear if the tickrate divisor is set to 0 (or at least be greyed out until then), and default to 60 rather than its current 200. If the user tries to input a limit above 60, there should probably be a warning that things could break, giving them a chance to reconsider. If the tickrate divisor (the option above) is altered, the framerate limiter should reset to 60 to avoid interference (e.g. if the user set a sub-60 limit here, not resetting means they can set the option above to 60 and not actually get 60). A sketch of this interaction follows this list.
    • Maybe there could also be an option for truly uncapped FPS as well; currently if you try to set the FPS cap to 0 (to disable it) or above 240, it resets to 240. Probably a request for the port maintainer rather than you, though.
  • VSync would just be exposed as-is with appropriate labels, though showing all the divisions you could do might be overkill. Maybe cap it at 1/4th refresh rate? Another problem might be people not understanding that vsync can override or be overridden by the FPS cap, but I'm not sure how you would address that within PD's UI constraints.
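
A minimal sketch of the reset-and-gate behavior proposed above, reusing the port's g_TickRateDiv and vidFramerateLimit variables; the two ui* helpers are invented for illustration:

// Invented UI-side helpers illustrating the proposal above; only
// g_TickRateDiv and vidFramerateLimit are real port state.
static void uiOnTickrateDivisorChanged(s32 newDiv)
{
	g_TickRateDiv = newDiv;
	// reset the advanced limiter so a stale sub-60 cap can't
	// silently override whatever the simple cap now implies
	vidFramerateLimit = 60;
}

static s32 uiFramerateLimitVisible(void)
{
	// only expose the raw limiter in "Advanced" mode (divisor == 0)
	return g_TickRateDiv == 0;
}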

@fgsfdsfgs (Owner) commented Nov 22, 2024

There are some reasons why I haven't exposed the framerate cap in the options. Mostly two:

  • The way it's implemented might be confusing for the end user ("what the fuck is a tickrate divisor?"), but that is also kind of necessary to support all possible combinations of vsync/fps cap/etc, because various setups can have different frame pacing issues. This necessitates some sort of unified option that sets both the divisor and the framerate limit, but that can also be difficult to get right (e.g. what should happen when vsync is on?).
  • High FPS modes are still buggy and I'd actually not recommend playing the game at over 120 FPS.

Of course we could just expose them as-is and let the user figure it out.

As for the PR itself, you should probably avoid exposing Fast3D headers to the game code where possible. This can be achieved by providing thin wrappers for mode listing, e.g.:

video.h

typedef struct {
	s32 width;
	s32 height;
} videomode;

s32 videoGetNumVideoModes(void);
s32 videoGetVideoMode(videomode *out, const s32 index);

video.c

s32 videoGetNumVideoModes(void)
{
	return wmAPI->get_num_video_modes();
}

s32 videoGetVideoMode(videomode *out, const s32 index)
{
	return wmAPI->get_video_mode(index, &out->width, &out->height);
}

gfx_sdl2.cpp

int gfx_sdl_get_num_video_modes(void) {
    return SDL_GetNumDisplayModes(0);
}

int gfx_sdl_get_video_mode(int modenum, int *out_w, int *out_h) {
    SDL_DisplayMode sdlmode;
    if (SDL_GetDisplayMode(0, modenum, &sdlmode)) {
        *out_w = sdlmode.w;
        *out_h = sdlmode.h;
        return 1;
    }
    return 0;
}

There's no need to allocate memory for them either. Well, maybe if you want to cache them in video.c, but that shouldn't be necessary.

@ZenithMDC (Author) commented Nov 22, 2024

It seems the framerate options are more involved than I had originally thought. Thank you both for your insights. I'll have more to say about that later, but, at the moment, I'm pretty exhausted.

Hmm... I can't just use the raw output of SDL_GetDisplayMode() in conjunction with SDL_GetNumDisplayModes(), since that results in "duplicates": really, modes with different refresh rates but the same resolution. That gfx_modes.h file was one I added because I needed to expose the struct within it to both the Fast3D code and the game code. Although, now that I think about it, did it even need to be a struct at all? It really behaves like an array of char arrays of size 10. Hmm... well, this is a bit embarrassing. Anyway, that should be an easy change, and then I don't need gfx_modes.h at all.

EDIT: Okay. Nevermind... using a struct is a million times simpler and far less error-prone to implement. Well, I could move gfx_modes.h to wherever you feel it should be, since it's not actually a part of Fast3D.

The modes array is formatted like so:

{"Custom", "3840x2160", "2560x1600", "2560x1440", "2048x1536", ...}

I could alternatively use a fixed-size array, but I don't want to end up causing someone with, like, 50 unique display modes to not have them all available. Also, when it comes to selecting the resolution from the dropdown menu, do the strings really not need to be cached?

@fgsfdsfgs (Owner) commented Nov 22, 2024

Okay, well, then they should all be cached and deduplicated in video.c at startup. Basically what you do in gfx_sdl2.cpp, except in video.c. For example:

static displaymode vidModeDefault;
static s32 vidNumModes = 1;
static displaymode *vidModes = &vidModeDefault;

static s32 videoInitDisplayModes(void)
{
	if (!wmAPI->get_current_display_mode(&vidModeDefault.width, &vidModeDefault.height)) {
		vidModeDefault.width = 640;
		vidModeDefault.height = 480;
		return false;
	}
	
	const s32 numBaseModes = wmAPI->get_num_display_modes();
	if (!numBaseModes) {
		return false;
	}
	
	displaymode *modeList = sysMemZeroAlloc(numBaseModes * sizeof(displaymode));
	if (!modeList) {
		return false;
	}

	s32 w, h;
	if (!wmAPI->get_display_mode(0, &w, &h)) {
		return false;
	}
	
	s32 numModes = 1;
	modeList[0].width = w;
	modeList[0].height = h;
	
	// SDL modes are guaranteed to be sorted high to low
	for (s32 i = 1; i < numBaseModes; ++i) {
		s32 neww, newh;
		if (!wmAPI->get_display_mode(i, &neww, &newh)) {
			continue;
		}
		
		if (neww != w || newh != h) {
			// remember this resolution so further duplicates of it are skipped
			w = neww;
			h = newh;
			modeList[numModes].width = w;
			modeList[numModes].height = h;
			++numModes;
		}
	}
	
	modeList = sysMemRealloc(modeList, numModes * sizeof(displaymode));
	if (!modeList) {
		return false;
	}
	
	vidModes = modeList;
	vidNumModes = numModes;
	
	return true;
}

s32 videoInit(void)
{
	...
	videoInitDisplayModes();
	initDone = true;
	return 0;
}

s32 videoGetNumDisplayModes(void)
{
	return vidNumModes;
}

s32 videoGetDisplayMode(displaymode *out, const s32 index)
{
	if (index >= 0 && index < vidNumModes) {
		*out = vidModes[index];
		return true;
	}
	return false;
}

Index 0 could still serve as "custom", you just fill it in with whatever the current mode is and add special handling wherever.
You also definitely don't need to store them as strings and constantly convert back and forth:

static MenuItemHandlerResult menuhandlerResolution(s32 operation, struct menuitem *item, union handlerdata *data)
{
	static char resstring[32];
	displaymode mode;
	
	switch (operation) {
	...
	case MENUOP_GETOPTIONTEXT:
		videoGetDisplayMode(&mode, data->dropdown.value);
		snprintf(resstring, sizeof(resstring), "%dx%d", mode.width, mode.height);
		return (intptr_t)resstring;
	...

@ZenithMDC (Author)

Okay, I see. This solves the gfx_modes.h issue entirely. That is what the issue is about and I'm not just obsessing over gfx_modes.h like a crazed individual, I hope? 😂 About the conversion: doesn't that have to happen at some point, regardless? Or maybe I'm missing something? But I like your solution more overall. It's much cleaner, anyway. I'll take a closer look tomorrow.

@fgsfdsfgs (Owner) commented Nov 22, 2024

About the conversion: doesn't that have to happen at some point, regardless?

You only need to convert them to a string when actually displaying them in the menu, because everywhere else the code just needs the numbers. It makes things easier and makes more sense to me, as you don't have to sscanf()/strtol() shit just to get the actual width and height. Not to mention it stores less data.

That is what the issue is about and I'm not just obsessing over gfx_modes.h like a crazed individual, I hope?

I just want Fast3D to remain as encapsulated as possible because it is cleaner this way and theoretically makes changing renderers less painful. It makes more sense to me when the actual display mode retrieval is on the backend (gfx_sdl2), while video.c deals with filtering/sorting/giving them to the user.

@ZenithMDC (Author)

Okay: after implementing, for the most part, what you sent me, debugging and testing everything again to make sure things still work correctly, evicting the string cache and its accomplice strtol(), and making sure things are encapsulated as they should be, we are basically back where we started. I've made some changes, too, like fixing the gfx_sdl2.cpp wrapper you provided, which didn't account for the fact that SDL returns 0 on success, among other things.

I opted to keep the "Custom" message for unknown video resolutions because I feel it works better, considering that just inserting whatever the current video mode is into index 0 can result in a duplicate entry, or downright confusion when a smaller value sits above a list of descending values. "Custom", on the other hand, looks special because it is special. But, admittedly, it can be another source of confusion if someone expects to be able to enter a custom resolution once it's selected.

@fgsfdsfgs (Owner)

the fact that SDL returns 0 as successful, among other things.

I opted to keep the "Custom" message for unknown video resolutions

Yes, I just provided the code as a janky example for how to manage the video mode list, this is all fine.

I think we don't have to actually add a way to enter a custom resolution from the menu because someone can just edit the INI for that. Most people will probably play in borderless fullscreen anyway.

@ZenithMDC (Author)

I think we don't have to actually add a way to enter a custom resolution from the menu because someone can just edit the INI for that. Most people will probably play in borderless fullscreen anyway.

Yeah, I'm not planning on incorporating that functionality.

@ZenithMDC (Author)

Okay, on the topic of framerate:

vidFramerateLimit affects the video backend's framerate, whereas g_TickRateDiv affects the game's tickrate (duh lol), or, in other words, the interval between actions in the game. The N64 version of the game always has g_TickRateDiv set to 1 (really, the variable it is an alias for, g_Vars.mininc60), so I imagine it is best to leave it at 1 whenever vidFramerateLimit doesn't exceed 60 and isn't disabled (0). If vidFramerateLimit is anything else, then g_TickRateDiv needs to be 0, because otherwise it would prevent the framerate from exceeding 60.

So, for example, something like:

if (g_TickRateDiv < 2) {
    g_TickRateDiv = (vidFramerateLimit == 0 || vidFramerateLimit > 60) ? 0 : 1;
}

the g_TickRateDiv < 2 check is there in case someone wants to set TickRateDivisor to a specific value in the ini.

I imagine setting g_TickRateDiv to anything other than 1 causes gameplay differences, though, and, to be honest, I'm not convinced of the usefulness of setting g_TickRateDiv above 1; not to mention it's really too coarse to be useful as a general frame limiter for the video menu. I was reading elsewhere (#343 (comment)) that there are goals to interpolate framerates above 60, in order to prevent bugs. Once that is implemented, would it actually be better to just leave g_TickRateDiv at 1 always?

One thing I don't like, at the moment, is that setting either vidFramerateLimit too low or g_TickRateDiv too high results in audio buffer underruns and input drops. I'm not sure it's possible to emulate the slower framerate of the N64 version without these issues presenting themselves, as things currently are.

Oh, and about vsync: vsync is king. If it's enabled, then vidFramerateLimit and g_TickRateDiv are overridden. However, it's important to note that not everyone has a monitor that refreshes at 60Hz. By the way, I read in the Wiki that a vsync value of "2+" means "sync every N frames". Is this actually true? I looked at the code and didn't see it implemented that way, and the documentation for the backend's SDL_GL_SetSwapInterval() makes no mention of it, either.

As for how I'd like things to be laid out in the video menu: I think it's best to just keep things simple. I was thinking of making the framerate option a slider, from 0 to, I suppose, 240. I'd disable the slider if vsync is enabled, just like I currently do for the resolution dropdown menu if vidFullscreen is enabled and vidFullscreenExclusive is 0 (aka borderless). For the time being, if vsync is disabled and someone attempts to set the framerate slider to 0, it will automatically be set to 240, just as things are currently implemented. That might seem peculiar to them, but I don't want to circumvent a feature that keeps the game, at least, playable.
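
For reference, here is a sketch of that 0-becomes-240 slider behavior, in the shape of the menu handlers shown earlier; data->slider.value is an assumption patterned after the dropdown/checkbox fields above:

static MenuItemHandlerResult menuhandlerFramerateLimit(s32 operation, struct menuitem *item, union handlerdata *data)
{
	switch (operation) {
	case MENUOP_GETSLIDER:
		data->slider.value = vidFramerateLimit;
		break;
	case MENUOP_SET:
		// 0 would disable the cap entirely, which can make the game
		// unplayable; snap it back to 240 to match current behavior
		vidFramerateLimit = (data->slider.value == 0) ? 240 : data->slider.value;
		break;
	}

	return 0;
}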

So yeah. If I have a totally screwed-up understanding of how this works, please let me know. 😂 I'm a bit skeptical of my own conclusion, since it seems to contradict what has already been mentioned here about the framerate, particularly about g_TickRateDiv. That's all, really.

@fgsfdsfgs (Owner) commented Nov 24, 2024

Yes, it would result in fewer bugs if everything above 60 FPS were interpolated like some other ports do. Arguably, "true" high FPS like what the port has right now is better since there's no interpolation delay, but it does result in more issues. For example, since the game uses a 240Hz timer with integer values, Combat Boost will start breaking at over 120 FPS, since it slows down time 2x. So either we need to fix this somehow or fall back to interpolation, which has its own issues.

The reason you can set the tickrate divisor above 1 is "why not". Maybe you want to hard lock the game to 30FPS and see what happens.

Something like what you're suggesting can be implemented, but you need to figure out how it will interact with vsync. There's no reliable way to determine your display's refresh rate (SDL's display info functions sometimes return bullshit and sometimes you have G-Sync, etc), so it's hard to determine what framerate the game will actually run at if vsync is enabled. If you rely on vidFramerateLimit, this will imply that the port will always cap framerate on the CPU side even if vsync is on, which might be undesirable, as the extra sleeping can result in shit frame pacing on some setups.

"Sync every N frames" is a somewhat confusing way to say "it will wait for N vblanks every SwapWindow call instead of 1 vblank". Meaning that if you set it to 2 and your display runs at 60Hz, the game will lock to 30FPS. If your monitor runs at 150Hz, the game will lock to 75FPS.

Both vidFramerateLimit and vsync stack on top of the already existing tickrate system. They don't care about each other, other than in the very specific case of "both vsync and tickrate divisor are set to 0", in which case the game will auto-cap to 240FPS using vidFramerateLimit unless it's already set to something lower. Vsync does not "override" the other caps, it just happens at the same time. This means that you can enable vsync on your 240Hz monitor and still cap the framerate to 60 if you so desire.

Proper frame pacing is a very difficult problem, at least when you don't have swap chains. Thus all the options you can screw with, so you could possibly find a solution even on the most weird-ass setup. It is also probably a completely separate issue from this PR.

@ZenithMDC (Author) commented Nov 24, 2024

Good points on monitor refresh rates being inconsistent and difficult to deal with, especially the variable refresh ones. Not sure how to deal with that, at the moment. I'll think of something eventually, I suppose.

EDIT: Actually wait, now that I know vsync and the other caps aren't mutually exclusive: in the video menu, I'll just allow the player to set vsync and also set a framerate limit.

"Sync every N frames" is a somewhat confusing way to say "it will wait for N vblanks every SwapWindow call instead for 1 vblank". Meaning that if you set it to 2 and your display runs at 60Hz, the game will lock to 30FPS. If your monitor runs at 150Hz, the game will lock to 75FPS.

Oh yeah, I know what it means, but I was confused about how it was actually implemented in the code. I couldn't find where it syncs every N frames.

EDIT: Okay, it wasn't documented by SDL, but I did find the relevant info from Khronos about wglSwapIntervalEXT(), which is what SDL calls behind the scenes: "The parameter <interval> specifies the minimum number of video frames that are displayed before a buffer swap will occur."
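
So, on the backend, it's just the interval being passed straight through. A sketch of the call (SDL_GL_SetSwapInterval() and its return convention are real SDL2 API; values above 1 passing through to the driver is the undocumented part):

// SDL_GL_SetSwapInterval() forwards the interval to wglSwapIntervalEXT /
// glXSwapIntervalEXT: 0 = off, 1 = every vblank, -1 = adaptive.
// Values > 1 aren't documented by SDL but pass through on most drivers.
if (SDL_GL_SetSwapInterval(interval) < 0) {
	SDL_GL_SetSwapInterval(1); // e.g. adaptive unsupported: fall back to plain vsync
}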

Vsync does not "override" the other caps, it just happens at the same time.

For some reason, I thought they were mutually exclusive. After checking the code again, they indeed are not. There is a confusing comment about set_target_fps() being disabled because vsync is on, but, well, it's just a comment.

Fullscreen exclusive mode is togglable at runtime as well, via a dropdown menu with "borderless" and "exclusive" options. All variables pertaining to video resolution, exclusive mode, and maximized state are now synchronized. For instance, Alt+Enter will affect the state of the fullscreen video option, and maximizing the window via the title bar will update the maximize window option accordingly.
@ZenithMDC (Author) commented Nov 25, 2024

I was thinking about the layout of the Extended Video Options again and came up with this:

Full Screen			[ ]
Full Screen Mode		<Borderless|Exclusive>
Resolution			<Custom|3840x2160|1920x1080|...>
Maximize Window			[ ]
Center Window			[ ]
Remember Window Position	[ ]
HUD Centering			<None|4:3|Wide>
-------------------------
Vsync				<Adaptive|Off|On (Sync Every Frame)|On (Sync Every 2 Frames)|...>
Framerate Limit			|0----------240|
Display FPS			[ ]
-------------------------
Anti-aliasing (MSAA)		<Off|2x|4x|8x|16x>
Texture Filtering		<Nearest|Bilinear|Three Point>
GUI Texture Filtering		[ ]
-------------------------
GE64-style Muzzle Flashes	[ ]
Explosion Shake			|0-----------20|
-------------------------
Framebuffer Effects		[ ]
Tickrate Divisor		|0-----------10|
-------------------------
Back

EDIT: Removed Allow HiDPI from Advanced category. See #538 (comment).
EDIT: No need for Vsync Interval anymore. Wrapped it all into Vsync.

It is organized into five categories of descending "importance/popularity":

  • Window/Positioning Options
  • Framerate Options
  • Filtering Options
  • Miscellaneous Options
  • Advanced Options

[ ] denotes a checkbox.
< > denotes a dropmenu.
| | denotes a slider.

Remember Window Position and Display FPS aren't implemented yet. I added Tickrate Divisor to the advanced category so people can actually set it to something other than 0 or 1 if they really want to, but it's now managed under the hood if set to either 0 or 1.

It is technically possible to eliminate Full Screen Mode and make Full Screen a dropdown menu with its respective options, plus Off prepended. However, I feel that would hurt usability, especially for borderless fullscreen mode, which disables Resolution while Full Screen is enabled. It might not be as obvious to people that they need to switch the mode to something other than Borderless to access Resolution again, compared to seeing a separate Full Screen checkbox option. There are some additional reasons for keeping these options separate, pertaining to windowed mode, as well as the fact that they (vidFullscreen and vidFullscreenExclusive) are separate variables in the code.

Vsync Interval is similarly disabled if Vsync is not set to Interval, but doesn't carry the caveats of Full Screen.
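
As an illustration of the Resolution gating described above, in the style of the handler fragments from earlier (MENUOP_CHECKDISABLED is my assumption for however the menu system dims items; vidFullscreen and vidFullscreenExclusive are the port's real variables):

static MenuItemHandlerResult menuhandlerResolution(s32 operation, struct menuitem *item, union handlerdata *data)
{
	switch (operation) {
	...
	// dim the Resolution dropdown while borderless fullscreen is active,
	// since the display drives the resolution in that mode
	case MENUOP_CHECKDISABLED:
		if (vidFullscreen && !vidFullscreenExclusive) {
			return true;
		}
		break;
	...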

Another thing to note: this no longer fits on a single screen and needs to be scrolled. I wanted to try to keep it all visible without the need for scrolling. Oh well. At least most of the important options will be visible without scrolling. Alternatively, I suppose it would be possible to make it so the player needs to select from one of those five aforementioned categories first, but I don't want to do that since I feel like that would become a nuisance.

Oh yes, and I need to make sure that some of these options aren't available if the hardware doesn't support them.

EDIT: forgot to mention: Framerate Limit max is actually set to VIDEO_MAX_FPS and not 240.

@LonelySpaceDetective commented Nov 25, 2024

Might be showing my idiocy here, but exposing the AllowHiDPI setting in the in-game UI seems like a bad idea.
Based on issues I've heard about in other games on Windows, what is going to happen is that people who have DPI scaling enabled in their OS, but have forgotten about it or don't realize what it is, will turn that setting on without thinking, and on launching the game later will be confused and probably angry at it suddenly being lower resolution.

I think if you know what "Allow HiDPI" means and know enough to want it on, you're probably enough of a power user to check the documentation or even go through the .ini file to see what options you can configure, and therefore don't need it to be part of the game UI.
Honestly, I don't even know why that setting is a thing; it being "off" should be ideal for everyone as far as I know, but I can't say I'm an expert on DPI stuff.

@ZenithMDC (Author) commented Nov 25, 2024

I put Allow HiDPI in the Advanced category for a reason: basically, nothing under there should be tinkered with unless the player knows exactly what they're doing with those options. Of course, these categories are just concepts; they aren't going to be called that in the actual game. So, hopefully, seeing Allow HiDPI next to Framebuffer Effects and Tickrate Divisor will keep the lay player away. But, idk.

I could alternatively hide those options away under an actual Advanced... menu.

@LonelySpaceDetective commented Nov 25, 2024

I put Allow HiDPI in the Advanced category for a reason. Basically, anything under there shouldn't be tinkered with unless the player knows exactly what they're doing with those options.

Right, but there are people who will absolutely ignore that, whether because they overestimate themselves or simply cannot read.
The reason HiDPI in particular strikes me as problematic is that I can easily see someone thinking "oh, that sounds good, I should turn it on" and then totally forgetting about it, since it'd only become apparent after a relaunch.

@ZenithMDC (Author)

Hmm... It does seem to be something that is set only when creating the window. Proper implementation at runtime would then require destroying and recreating the window, which is kind of a pain in the ass, so... I should probably drop it from the menu after all.

@fgsfdsfgs (Owner)

Pretty sure changing HiDPI settings requires recreating the window because it's a window flag + hint, which we don't do. I'd leave it in the ini only.

@ZenithMDC (Author)

After implementing and testing the Vsync options, it's become apparent that my proposal is inadequate: sliders can be disabled, but they don't take on the appearance of a disabled menu item. Moving on. I'm currently looking at the MP Limits menu for inspiration. It seems one can add labels to specific slider values. Might be interesting...

@ZenithMDC (Author) commented Nov 26, 2024

[screenshots: evo2, evo3]

Okay, I've got Vsync and Framerate Limit implemented now. I decided to get rid of Vsync Interval; it was easier to just wrap everything into a single dropmenu item. The Vsync dropmenu label is kind of wide, but I feel abbreviating it further would lose important info. Idk, I'm open to suggestions if it's unacceptably wide. There was an annoying issue implementing Framerate Limit, but I came up with what I think is a pretty good solution, detailed below:

I implemented a new menu item flag, MENUITEMFLAG_SLIDER_DEFERRED, which only calls the MENUOP_GETSLIDER callback on entering dimmed mode, storing its value for later ticks, and only calls the MENUOP_SET callback on leaving dimmed mode. This was necessary to prevent the framerate limit slider from immediately affecting the framerate, which led to an annoying slowdown as the value was decremented to 0.

I imagine this feature could be useful for any other slider where one would want to defer its update.
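
Conceptually, the flag changes the slider's tick roughly like this (a paraphrase of the description above, not the actual patch; the dimmed-mode booleans and the cache are invented):

// Paraphrase of MENUITEMFLAG_SLIDER_DEFERRED; control-flow names invented.
if (item->flags & MENUITEMFLAG_SLIDER_DEFERRED) {
	if (enteredDimmedMode) {
		// read the real value once, then edit only a cached copy
		handler(MENUOP_GETSLIDER, item, &data);
		cachedValue = data.slider.value;
	} else if (leftDimmedMode) {
		// commit the cached value only once the player confirms
		data.slider.value = cachedValue;
		handler(MENUOP_SET, item, &data);
	}
}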

@ZenithMDC marked this pull request as draft November 26, 2024 11:23
@ZenithMDC (Author)

This seemed too easy. Was there a reason this was being deferred until level load?

diff --git a/port/src/optionsmenu.c b/port/src/optionsmenu.c
index dfc12c97b..9e23290af 100644
--- a/port/src/optionsmenu.c
+++ b/port/src/optionsmenu.c
@@ -905,6 +905,7 @@ static MenuItemHandlerResult menuhandlerTexFilter2D(s32 operation, struct menuit
                return videoGetTextureFilter2D();
        case MENUOP_SET:
                videoSetTextureFilter2D(data->checkbox.value);
+               g_TexFilter2D = videoGetTextureFilter2D() ? G_TF_BILERP : G_TF_POINT;
                break;
        }
