Unable to load nodes due to excessive memory consumption #239

Open
mwantia opened this issue Mar 16, 2024 · 4 comments

Comments

@mwantia

mwantia commented Mar 16, 2024

I am currently trying to create a Mega client within Docker, but I am unable to complete the execution of this method even after waiting for an hour.

public void CacheNodes()
{
    Console.WriteLine("Caching nodes...");
    lock (_lock)
    {
        IsRunning = true;
        if (_client.IsLoggedIn)
        {
            // GetNodes() downloads the metadata of every node in the account in one call.
            _nodes = _client.GetNodes();
            // SingleOrDefault returns null when no root node is found;
            // Single would throw instead of ever reaching the null check.
            if ((Root = _nodes.SingleOrDefault(node => node.Type == NodeType.Root)) == null)
            {
                IsCompleted = false;
            }
            else
            {
                IsCompleted = true;
            }
        }
        IsRunning = false;
    }
    Console.WriteLine("Caching nodes completed.");
}

It works when I test everything on my PC, but only because the machine can spare the memory: Visual Studio shows a usage of 5.2 GB under Diagnostic Tools.
This is obviously not an amount that I can just provide to a Docker container whose only job is to download files.

Are there any ways to avoid this kind of memory consumption?
Is there a reason why the amount is so high, even though it only loads the metadata for each node?
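
For reference, a rough way to measure how much managed memory the node list itself retains (as opposed to temporary allocations made while the metadata is downloaded and decrypted) is to compare GC.GetTotalMemory before and after the call. This is just a sketch around the same _client field as above, not part of the library:

// Measurement sketch: compares managed heap size before and after GetNodes()
// to estimate how much memory the cached node metadata itself retains.
public void MeasureNodeMemory()
{
    long before = GC.GetTotalMemory(forceFullCollection: true);

    var nodes = _client.GetNodes().ToList();

    long after = GC.GetTotalMemory(forceFullCollection: true);
    Console.WriteLine($"{nodes.Count} nodes retain roughly {(after - before) / (1024.0 * 1024.0):F1} MB of managed memory");
}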

@gpailler
Owner

How many files/folders do you have on the Mega account?

@mwantia
Author

mwantia commented Mar 23, 2024

I guess the dashboard should display the correct amount of data (files/folders) for the account I am using:
[screenshot of the account dashboard]
I wouldn't say that this is an extreme amount of data, so I don't know whether this is simply a limitation of this project or whether there is somehow a way to dump the list of nodes elsewhere and only load part of it when required.

Also, this is the number of node items I get after executing Client.GetNodes():
[screenshot of the node count]

@gpailler
Owner

Well, it's definitely a lot of files/folders, and I have never tried this library with such a huge number of nodes. What is quite strange is that the number of nodes doesn't match the number of files and folders at all.
At the time of the implementation, it was only possible to dump all the nodes from the Mega API. Maybe there is another endpoint today to retrieve only a subset of the nodes, but I didn't investigate.
Would it be possible to check why you have ~1.5m nodes retrieved for only ~230k files?
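One quick way to check could be to group the retrieved nodes by their type and count them. Just a sketch, where client is the logged-in MegaApiClient instance and Type is the same INode property you already use to find the root:

// Diagnostic sketch: counts the retrieved nodes per NodeType to show
// where the unexpected ~1.5m entries come from.
var nodes = client.GetNodes();
foreach (var group in nodes.GroupBy(node => node.Type))
{
    Console.WriteLine($"{group.Key}: {group.Count():N0}");
}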

@mwantia
Author

mwantia commented Mar 26, 2024

I will see if I can export all nodes into a big CSV file.
Maybe I can identify whether there are broken nodes or something similar.
That could explain why there are so many and why it takes minutes to load them.
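
A minimal sketch of what I have in mind, assuming INode also exposes Id, ParentId, Name and Size in addition to the Type property used above:

// Export sketch: dumps basic metadata for every node into a CSV so that
// duplicate, orphaned or otherwise broken entries can be spotted afterwards.
using (var writer = new StreamWriter("nodes.csv"))
{
    writer.WriteLine("Id;ParentId;Type;Name;Size");
    foreach (var node in _client.GetNodes())
    {
        writer.WriteLine($"{node.Id};{node.ParentId};{node.Type};{node.Name};{node.Size}");
    }
}

Semicolons are used as separators since file names often contain commas; names that contain semicolons themselves would still need escaping.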
