Memory consumption when repeatedly requesting recommendations #383

Open
mhrmsn opened this issue Sep 27, 2024 · 4 comments
Labels
bug (Something isn't working) · help wanted (Extra attention is needed)

Comments


mhrmsn commented Sep 27, 2024

Here's a small example of a script that repeatedly obtains recommendations from baybe. For some reason, memory consumption keeps growing with each call to the recommender (profiled using memray).

python memory_test.py 1 allocates roughly 0.9 GB in recommender.recommend()

python memory_test.py 20 allocates roughly 28.5 GB in recommender.recommend()

Is it possible to reduce the memory consumption here? It's not obvious why some objects created during the recommendation persist when I'm only interested in the final recommendations. I also tried recreating the recommender object inside the loop and using del recommender after .recommend(), but neither improves the memory consumption.

memory_test.py

import sys
import pandas as pd
import numpy as np

from baybe.parameters import NumericalContinuousParameter
from baybe.searchspace.continuous import SubspaceContinuous
from baybe.recommenders import BotorchRecommender
from baybe.objectives import DesirabilityObjective
from baybe.targets import NumericalTarget
from baybe.acquisition import qNIPV

# Searchspace & Parameters
searchspace = SubspaceContinuous(
    (NumericalContinuousParameter(name=f"param_{i}", bounds=(0, 1)) for i in range(10))
).to_searchspace()

# Objective
objective = DesirabilityObjective(
    (NumericalTarget(name=f"target_{i}", mode="MAX", bounds=(0, 1), transformation="LINEAR") for i in range(3))
)

# Recommender
recommender = BotorchRecommender(
    acquisition_function=qNIPV(100)
)

# Measurements
measurements = pd.DataFrame(
    np.random.uniform(0, 1, (100, 13)),
    columns=[f"param_{i}" for i in range(10)] + [f"target_{i}" for i in range(3)]
)

for _ in range(int(sys.argv[1])):
    recommendations = recommender.recommend(
        10,
        searchspace,
        objective,
        measurements
    )
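
(For reference, the figures above come from a memray run; this is only a sketch of the invocation, with an arbitrary output filename:)

python -m memray run -o recommend.bin memory_test.py 20
python -m memray flamegraph recommend.bin
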
@Scienfitz (Collaborator)

What about gc.collect(), with / without the del?
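
i.e. roughly a variant like this (just a sketch; the del could equally target the recommender itself, if it is recreated each iteration):

import gc

for _ in range(int(sys.argv[1])):
    recommendations = recommender.recommend(10, searchspace, objective, measurements)
    del recommendations  # drop the reference explicitly
    gc.collect()         # force a full garbage collection each iteration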

Scienfitz commented Sep 27, 2024

I remember we've had some strange memory observations a long time ago, @AdrianSosic, but they seemed to be on the torch side and would more or less just cause things to be kept in memory; they were freed if the memory limits were about to be reached (i.e. in effect no memory problem).

A quick way of investigating this suspicion would be to crank up the number of iterations to 1000 or so and observe whether that script actually crashes the machine or not.

mhrmsn commented Sep 27, 2024

> What about gc.collect(), with / without the del?

I've tested that, and while it alters the memory consumption, it doesn't resolve the general issue.

mhrmsn commented Sep 27, 2024

> A quick way of investigating this suspicion would be to crank up the number of iterations to 1000 or so and observe whether that script actually crashes the machine or not.

This will result in the process being killed after it consumes too much memory:

> python memory_test.py 500
[1]    32660 killed     python memory_test.py 500
/Users/mhrmsn/mambaforge/envs/baybe/lib/python3.12/multiprocessing/resource_tracker.py:254: UserWarning: resource_tracker: There appear to be 1 leaked semaphore objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '

Scienfitz added the bug (Something isn't working) and help wanted (Extra attention is needed) labels on Sep 30, 2024