Here's a small example of a script that repeatedly obtains recommendations from baybe. For some reason, memory consumption increases with each call to the recommender (profiled using memray).
python memory_test.py 1 allocates roughly 0.9 GB in recommender.recommend()
python memory_test.py 20 allocates roughly 28.5 GB in recommender.recommend()
Is it possible to reduce the memory consumption here? It's not obvious why some objects created during the recommendation appear to persist when I'm only interested in the final recommendations. I also tried recreating the recommender object inside the loop and calling del recommender after .recommend(), but neither improves the memory consumption.
I remember that a long time ago we had some strange memory observations, @AdrianSosic, but they seemed to be on the torch side and would just cause things to be kept in memory more or less; they were freed if the memory limits were about to be reached (i.e., in effect no memory problem).
A quick way of investigating this suspicion would be to crank up the number of iterations to 1000 or so and observe whether that script actually crashes the machine or not.
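To make that check concrete, one can log the process's peak resident set size per iteration; if torch is merely caching, the curve should plateau instead of growing linearly. A minimal sketch using only the stdlib resource module (note: ru_maxrss is reported in KB on Linux but in bytes on macOS; the placeholder workload stands in for the real recommend() call):

```python
import resource

def peak_rss_kb():
    # Peak resident set size of this process so far (KB on Linux).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

peaks = []
for i in range(1000):
    # recommender.recommend(...)  # the call under investigation
    _ = [0.0] * 10_000  # placeholder workload
    peaks.append(peak_rss_kb())

# Linear growth across iterations suggests genuine retention;
# a plateau suggests allocator/torch caching that is reclaimable.
print(f"first iteration: {peaks[0]} KB, last iteration: {peaks[-1]} KB")
```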
This will result in the process being killed after it consumes too much memory:
> python memory_test.py 500
[1] 32660 killed python memory_test.py 500
/Users/mhrmsn/mambaforge/envs/baybe/lib/python3.12/multiprocessing/resource_tracker.py:254: UserWarning: resource_tracker: There appear to be 1 leaked semaphore objects to clean up at shutdown
warnings.warn('resource_tracker: There appear to be %d '