Memory usage remains high even after destroying sockets #3978
Comments
@ChALkeR I see you've added the `invalid` label to this issue. May I know why?
Several reasons for that.
@ayeressian I was still writing a comment at that moment =).
You should probably remove the sockets from the `sockets` array.
@targos The reference to the sockets array gets lost after the setTimeout, so the GC should remove it. You can test it yourself; it will not affect memory usage.
Do you have a better testcase for that? For example, if looping the above testcase several times (with gc runs) raised the memory usage over the limit (resulting in a crash), that would be an issue.
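A minimal sketch of such a looped testcase (not from this thread; the port, counts, timing, and modern async/await syntax are all assumptions, and `global.gc()` requires the `--expose-gc` flag):

```js
// loop-test.js — run with: node --expose-gc loop-test.js
// Repeats the open/close cycle with a forced GC between rounds. Steady
// growth of rss across rounds would indicate a real leak; a high but
// stable rss would only show the allocator holding on to pages.
const net = require('net');

const SOCKETS_PER_ROUND = 500; // small enough to stay under default ulimits
const ROUNDS = 10;

const server = net.createServer((socket) => {
  socket.on('error', () => {});
  socket.pipe(socket); // echo everything back
});

function oneRound() {
  return new Promise((resolve) => {
    const sockets = [];
    let closed = 0;
    for (let i = 0; i < SOCKETS_PER_ROUND; i++) {
      const s = net.connect(8000, '127.0.0.1', () => s.write('hello'));
      s.on('error', () => {});
      s.on('close', () => {
        if (++closed === SOCKETS_PER_ROUND) resolve();
      });
      sockets.push(s);
    }
    // destroy every socket after a short exchange
    setTimeout(() => sockets.forEach((s) => s.destroy()), 1000);
  });
}

server.listen(8000, async () => {
  for (let i = 0; i < ROUNDS; i++) {
    await oneRound();
    global.gc(); // force a full collection between rounds
    console.log(`round ${i}: rss=${process.memoryUsage().rss}`);
  }
  server.close();
});
```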
I don't see how it can get lost. It's a global variable. Anyway, it's just a suggestion, and I cannot test it myself. The client code of the testcase you provided fails to run on my computer.
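For reference, this is all the suggestion amounts to (a fragment, assuming the global `sockets` array from the client):

```js
// Destroying a socket closes its file descriptor, but the JS object is
// still reachable through the global array until the reference is dropped.
sockets.forEach((socket) => socket.destroy());
sockets.length = 0; // now the GC is free to collect the socket objects
```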
@targos what is your node version?
v5.1.0
@targos you need to increase the ulimit of your system, or you can decrease the number of sockets.
@targos every open socket in an application requires at least one open file in the system. By default, the number of open files allowed per process by the OS is less than 15000, so you need to raise that limit with the ulimit command (e.g. `ulimit -n 20000`). I assume you are using Unix or Linux.
@ChALkeR I will try to improve the test case.
@ayeressian The results (I also altered the logging a bit; it now logs at the start and at the end):

I do not see a leak.
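The altered logging itself was not preserved above; a sketch of what logging memory at the start and at the end can look like (the helper name is an assumption):

```js
// Log the three numbers that matter for this kind of comparison.
function logMemory(label) {
  const { rss, heapTotal, heapUsed } = process.memoryUsage();
  console.log(`${label}: rss=${rss} heapTotal=${heapTotal} heapUsed=${heapUsed}`);
}

logMemory('start');
// ... create the sockets, exchange messages, destroy the sockets ...
logMemory('end');
```

If `heapUsed` returns to its starting level while `rss` stays high, the memory is being held by the allocator rather than leaked by the program, which matches the allocator discussion below.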
@ChALkeR you are right. Thanks for your help.
@ayeressian As for the memory allocators (number 3 in #3978 (comment)), check out this example (C++):

```cpp
#include <string>
#include <vector>
#include <cstdio>
#include <iostream>
#include <malloc.h>

using namespace std;

vector<string> *x, *y;

int main() {
    x = new vector<string>();
    y = new vector<string>();
    // Fill x with ~5 million strings so it occupies a lot of heap memory.
    for (int i = 0; i < 5 * 1024 * 1024; i++) {
        x->push_back(to_string(i) + string(" test"));
    }
    // One small allocation in y, made after x, ends up above it on the heap.
    y->push_back(to_string(0) + string(" test"));
    // Free everything x held.
    x->clear();
    delete x;
    cout << "ready" << endl;
    getchar(); // pause here and look at the process's memory usage
    y->clear();
    delete y;
    return 0;
}
```

It would not give the memory consumed by `x` back to the system while the program waits at `getchar()`: the later, still-live allocation in `y` sits above the freed region, so the allocator cannot shrink the heap even though almost all of it is free.
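On glibc, `malloc_trim(0)` (declared in `malloc.h`, which is presumably why that header is included above) can ask the allocator to return free memory to the OS where possible.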
@ChALkeR hmm... interesting. I think this behaviour depends on the OS's memory-management algorithm.
I've created a simple Node.js server and client. They interact with each other via 15000 TCP sockets.
client code:
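The snippet itself is not shown here; below is a minimal sketch consistent with the description. The port, host, message contents, and 10-second timeout are assumptions.

```js
// client.js — hypothetical reconstruction, not the original snippet
const net = require('net');

const NUM_SOCKETS = 15000; // needs a raised open-file limit, per the ulimit discussion above
const sockets = [];

for (let i = 0; i < NUM_SOCKETS; i++) {
  const socket = net.connect(8000, '127.0.0.1', () => socket.write('hello'));
  socket.on('data', () => {});                              // consume replies
  socket.on('error', (err) => console.error(err.message));
  sockets.push(socket);
}

// Destroy every socket after a while; memory usage is expected to drop here.
setTimeout(() => {
  sockets.forEach((socket) => socket.destroy());
  console.log('destroyed:', process.memoryUsage());
}, 10000);
```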
server code:
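And a matching server sketch (again a reconstruction under the same assumptions, not the original):

```js
// server.js — hypothetical reconstruction, not the original snippet
const net = require('net');

const server = net.createServer((socket) => {
  socket.on('data', (data) => socket.write('echo: ' + data)); // reply to each message
  socket.on('error', (err) => console.error(err.message));
});

server.listen(8000, () => console.log('listening on port 8000'));
```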
They send and receive messages. While the sockets are being created, memory usage climbs, but after the sockets are destroyed I expect memory usage to drop, which doesn't happen.
Stackoverflow url