Frequent OutOfMemoryErrors in Builds #41061
Pinging @elastic/es-core-infra
This one looks slightly different, but I'll add it here to get a better overview of things: https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+7.x+artifactory/283/console
I suspect that it is the Gradle client, not the daemon, that is running out of memory here, because there's no daemon-crashed message.
The Gradle Daemon actually running the build does so with a 2GB heap. This PR changes the heap configuration of the Gradle process that talks to the daemon to trigger the builds and relays its messages. Relates to elastic#41061
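For context, the two heaps are configured independently: `org.gradle.jvmargs` in `gradle.properties` applies only to the daemon, while the client/launcher JVM picks up `GRADLE_OPTS`. A minimal sketch of what such a change could look like (the values below are illustrative assumptions, not the ones actually used in CI):

```
# gradle.properties — heap of the daemon that actually runs the build
org.gradle.jvmargs=-Xmx2g

# CI environment — heap of the client/launcher JVM that talks to the daemon
# and relays its output (illustrative values only)
export GRADLE_OPTS="-Xmx512m -XX:+HeapDumpOnOutOfMemoryError"
```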
Two more:
Looking at some failures after #41031 was merged, it seems no heap dump was produced, which seems to confirm this is a problem on the client.
We see another instance of this in https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+7.x+multijob-unix-compatibility/os=centos-7&&immutable/113/console. It fails with:
Inspecting the resulting heap dump, we see almost all of the memory (124 out of 128MB) taken up by instances of
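A sketch of how such a per-class breakdown can be obtained with standard JDK tooling (the comment doesn't say which tool was actually used, so this is an assumption):

```
# class histogram of a running JVM, e.g. the Gradle client while its heap grows
jmap -histo:live <client-pid> | head -n 25

# for a crash dump produced by -XX:+HeapDumpOnOutOfMemoryError, open the
# resulting .hprof file in Eclipse MAT or VisualVM and inspect the dominator tree
```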
There is also another instance of this failure in https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+7.x+multijob-unix-compatibility/os=oraclelinux-6/113/console. The assessment is identical to the failure above, hence I'll omit the details.
We are no longer using these dependencies. Relates to elastic#41061 since the class that seems to be leaking is both part of Gradle and the logging jar.
Certainly the way that test runners ship log output to the client has changed, so it's not surprising that the Gradle native daemon->client communication creates more garbage. I'm wondering if the recent change to remove
Looking at the build logs, it looks like this stopped happening after the cleanup.
There seem to be a few instances of builds ending with OutOfMemoryErrors lately.
Some recent ones:
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+7.x+artifactory/273/console
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+master+intake/3057/console
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+master+matrix-java-periodic/ES_BUILD_JAVA=openjdk12,ES_RUNTIME_JAVA=openjdk12,nodes=immutable&&linux&&docker/350/console
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+7.x+matrix-java-periodic/ES_BUILD_JAVA=openjdk12,ES_RUNTIME_JAVA=zulu8,nodes=immutable&&linux&&docker/126/console
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+7.x+multijob-unix-compatibility/os=ubuntu-14.04&&immutable/99/console
Builds end with something like: