[BUG] - Memory leaks during environment creation tasks #848
Comments
The conda-store worker Docker container's memory usage rises by ~18 MB with each new env build, according to …
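A minimal sketch of how that per-build growth could be sampled with the Docker SDK for Python; the container name "conda-store-worker" and the polling interval are assumptions, not the setup used in the report.

```python
# Sample a container's memory usage before and after environment builds.
# Assumes the Docker SDK for Python ("docker" package) and that the worker
# container is named "conda-store-worker" (hypothetical name).
import time

import docker

client = docker.from_env()
worker = client.containers.get("conda-store-worker")

def memory_usage_mb() -> float:
    """Return the container's current resident memory usage in MB."""
    stats = worker.stats(stream=False)
    return stats["memory_stats"]["usage"] / 1e6

baseline = memory_usage_mb()
print(f"baseline: {baseline:.1f} MB")

# Trigger env builds out of band (e.g. via the conda-store UI), then sample
# again to see the per-build delta.
for i in range(5):
    time.sleep(60)  # rough wait for one build to finish
    current = memory_usage_mb()
    print(f"sample {i}: {current:.1f} MB (delta {current - baseline:+.1f} MB)")
```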
Per @Adam-D-Lewis's comment above about seeing memory increases even after bypassing the initially flagged action, we need to do a more in-depth profiling analysis.
`action_add_conda_prefix_packages` leaks memory
I tried restricting the Docker container to 1 GiB of memory, and the memory growth per build was then less than 18 MB, but it did still increase, and eventually the Celery worker was restarted due to memory usage. I'm wondering whether the extra 11 MB of growth per build in the container, when no memory limit was specified, was just due to Python being a memory-managed language, where memory from garbage-collected objects is not always given back to the OS. That said, I think it should still be possible to write a test showing that memory usage grows to the point where a Celery worker runs out of memory.
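A rough sketch of what such a test could look like: call the suspect code path repeatedly and assert that the process's RSS stays within a budget. `run_build_action` is a placeholder for however the action is invoked in the conda-store test suite, and psutil is assumed to be available; the iteration count and budget are illustrative.

```python
# Regression-test sketch for unbounded memory growth across repeated builds.
import gc
import os

import psutil

def rss_mb() -> float:
    """Resident set size of the current process, in MB."""
    return psutil.Process(os.getpid()).memory_info().rss / 1e6

def test_repeated_builds_do_not_leak(run_build_action, iterations=50, budget_mb=200):
    gc.collect()
    baseline = rss_mb()
    for _ in range(iterations):
        run_build_action()
        gc.collect()  # rule out growth that is just uncollected garbage
    growth = rss_mb() - baseline
    assert growth < budget_mb, f"RSS grew by {growth:.1f} MB over {iterations} builds"
```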
@peytondmurray is this still relevant? IIRC we could not demonstrate that a leak was in fact happening, but we will be doing some profiling soon too.
That sounds fine; closing now. Even if there was a leak, we are now restarting workers regularly, so the symptom is avoided in principle even if the underlying leak (if any) isn't solved.
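For reference, Celery can recycle worker child processes once they exceed a memory or task-count threshold, which is the kind of regular restart that bounds the symptom. The thresholds below are illustrative, not the values conda-store actually uses.

```python
# Sketch of Celery settings that recycle worker children before a leak
# can exhaust memory. Values are placeholders.
from celery import Celery

app = Celery("conda_store_worker")
app.conf.update(
    worker_max_memory_per_child=512_000,  # kilobytes; restart child after ~500 MB resident
    worker_max_tasks_per_child=100,       # also recycle after 100 tasks as a backstop
)
```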
I'm running conda-store-ui locally with a fresh build. My creation of an env with …
Are you running this in Docker or standalone?
If you're using …
I was using … I was trying to look at the logs, but I don't know what all the output means. For example, it seems like a "bulk save" task is rather time-consuming and unrelated to any env build I'm doing, but I don't know what that is or what it does.
Describe the bug
See nebari-dev/nebari#2418 and #840 for context. TL;DR: `action_add_conda_prefix_packages` is leaking memory, causing problems on various nebari deployments.

Edit by @trallard: it seems `action_add_conda_prefix_packages` is not the main, or at least not the sole, culprit of the memory leaks, so I adjusted the title.

Memray flamegraph:
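For anyone reproducing the flamegraph, a minimal sketch of how a capture like the one referenced above could be produced with memray's Python API; `build_environment()` stands in for whatever conda-store code path is being profiled.

```python
# Capture allocations for a single build, then render an HTML flamegraph.
from memray import Tracker

def build_environment():
    ...  # placeholder for the build / action under investigation

with Tracker("worker_build.bin"):
    build_environment()

# Render the capture afterwards:
#   python -m memray flamegraph worker_build.bin
```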
Expected behavior
No memory leaks.
How to Reproduce the problem?
See nebari-dev/nebari#2418 for a description.
Output
No response
Versions and dependencies used.
No response
Anything else?
No response