
Hardware Memory Usage Query #2447

Open
cwfitzgerald opened this issue Feb 2, 2022 · 2 comments
Labels
area: performance How fast things go help required We need community help to make this happen. type: enhancement New feature or request

Comments

@cwfitzgerald
Member

cwfitzgerald commented Feb 2, 2022

Is your feature request related to a problem? Please describe.

Users often need to know how much GPU memory they can actually use so they can budget their data up front, rather than reacting to memory pressure after the fact.

Describe the solution you'd like

Add an API that dispatches to the following backend APIs:

Where information about real memory usage is unavailable, we can instead expose the "expected heap sizes", which are:

I think we should just expose both if we can, with a simple:

pub struct MemInfo {
    pub used: u64,
    pub max: u64,
}

pub struct MemoryUsage {
    pub actual: Option<MemInfo>,
    pub expected: Option<MemInfo>,
}
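For illustration, here is a minimal sketch of how a caller might consume these structs; the `headroom` helper is hypothetical and not part of the proposal:

```rust
pub struct MemInfo {
    pub used: u64,
    pub max: u64,
}

pub struct MemoryUsage {
    pub actual: Option<MemInfo>,
    pub expected: Option<MemInfo>,
}

/// Hypothetical helper: how many bytes the application can still
/// allocate, preferring real usage when the backend reports it and
/// falling back to the expected heap sizes otherwise.
fn headroom(usage: &MemoryUsage) -> Option<u64> {
    let info = usage.actual.as_ref().or(usage.expected.as_ref())?;
    Some(info.max.saturating_sub(info.used))
}
```

An application streaming a large dataset could poll this each frame and throttle uploads when the headroom drops below a budget.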

Describe alternatives you've considered

Not providing this information. That works in a pinch, but in practice it isn't great: applications that need to store datasets well above what GPU memory can hold need some way to gauge current memory pressure.

Additional context

BVE-Reborn/rend3#348

@cwfitzgerald cwfitzgerald added type: enhancement New feature or request help required We need community help to make this happen. area: performance How fast things go labels Feb 2, 2022
@pixelcluster

pixelcluster commented Feb 4, 2022

FWIW, GL has GL_NVX_gpu_memory_info which seems to be supported okay-ish (~50% of reports on gpuinfo).

GPU_MEMORY_INFO_TOTAL_AVAILABLE_MEMORY_NVX would then be maximum/"expected" memory size, and GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX would be the actual available value.
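For reference, the extension reports these values in KiB via `glGetIntegerv`. A hedged sketch of folding the two queries into a used/max pair, matching the shape of the proposal above: the enum values are taken from the GL_NVX_gpu_memory_info spec, the `glGetIntegerv` calls themselves are omitted since they need a live GL context, and `nvx_to_used_max` is a hypothetical helper:

```rust
// Enum values from the GL_NVX_gpu_memory_info extension spec.
#[allow(dead_code)]
const GPU_MEMORY_INFO_TOTAL_AVAILABLE_MEMORY_NVX: u32 = 0x9048;
#[allow(dead_code)]
const GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX: u32 = 0x9049;

/// Hypothetical helper: convert the two glGetIntegerv results
/// (both reported in KiB) into a (used, max) pair in bytes.
fn nvx_to_used_max(total_kib: i32, current_available_kib: i32) -> (u64, u64) {
    let max = total_kib as u64 * 1024;
    let available = current_available_kib as u64 * 1024;
    // The extension reports what is *available*, so used = max - available.
    (max.saturating_sub(available), max)
}
```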

@cwfitzgerald
Member Author

I'm not positive that will show up since we use GLES, but if it does, we totally should. Thanks for the pointer!

@cwfitzgerald cwfitzgerald added this to the Release 0.14 milestone Aug 4, 2022
@cwfitzgerald cwfitzgerald removed this from the Release 0.15 milestone Jan 26, 2023
cwfitzgerald pushed a commit that referenced this issue Oct 25, 2023
This avoids breaking the build with the latest release of dxc, which
made HLSL the default.