Cache management facilities #1035
@cgils, are you talking about project "cache" (it is not really a cache, it is just your output directory) or stuff that gets into ~/.cache/bazel?
Stuff that gets put into
I vote for this to be higher priority :). Bazel is commonly filling up hard drives on our project from the
+1
+1
+1, especially if the home dir is on NFS.
+1
+1
+1
+1
+1
@jin, do you have an idea which component to assign this issue to?
@buchgr team-Local-Exec sounds like a good fit.
There are several kinds of data to be managed here:
Separately, having a command to dump all files Bazel knows about, grouped by category/project and with a summary of their size, would be awesome. Tag each entry with an ID so you can tell
+1
Per #2765, one more thing to consider if implementing a smarter clean command is
+1
+1
It has been some time since 2016; it'd be great for this to get higher priority. To give some context: I'm on a 2TB drive and have to keep fighting with Bazel for living space. My ~/.cache/bazel consistently eats up to 0.7TB of free space (in 2-3 weeks). I then have to purge it, and that results in multi-GB downloads (CUDA, etc.). It'd be great to add a size limit and LRU-type eviction to the cache. The alternative is to remove the "{Fast, Correct} - Choose two" slogan from the web page: eating all the space on the disk is not correct, and making a user purge the cache completely is not fast.
This issue should really be P1, not P2. It is extremely bad that on my system there have been nearly a million files generated in the

On many personal computers, people might just not realize this is happening, and have their entire filesystem slowed down due to the additional effort needed by their system to index everything. Then, on some shared computers with hard file-count limits, Bazel is effectively unusable because it generates so many files. (This is the only reason I noticed this in the first place; otherwise I would have just had a slower filesystem without knowing why.)
+1
+1
+1
Please consider adding some way of setting a maximum cache size. At the moment our project's build caches grow without bound, forcing us to discard them periodically.
A facility for displaying cache information would be nice as well, something like the output of `ccache -s`.
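For illustration, a minimal stats formatter in the spirit of `ccache -s`-style output might look like the following. This is a hypothetical helper, not part of any existing tool, and the statistic names in the usage example are invented:

```python
def format_stats(stats):
    """Render a dict of cache statistics as an aligned two-column table,
    loosely in the style of `ccache -s` output (hypothetical sketch)."""
    width = max(len(key) for key in stats)
    lines = []
    for key, value in stats.items():
        lines.append(key.ljust(width) + "  " + str(value))
    return "\n".join(lines)
```

Called with something like `format_stats({"files in cache": "912,044", "cache size": "0.7 TB"})`, it prints one left-aligned label and one value per line, which is enough for a first pass at the visibility this comment asks for.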