Issues with backup preparation times #78
Comments
Hide the stall. Keep track of the time taken to hash the last few files, take an average, and use that to display progress while hashing. 😎
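The rolling-average idea above could look something like this (a minimal sketch; the class and method names are hypothetical, not from the mod):

```python
from collections import deque

class HashProgressEstimator:
    """Estimate remaining hashing time from a rolling average of
    recent per-file throughput (bytes per second)."""

    def __init__(self, window: int = 5):
        # Keep only the last `window` (bytes, seconds) samples.
        self.samples = deque(maxlen=window)

    def record(self, nbytes: int, seconds: float) -> None:
        """Call after each file is hashed."""
        self.samples.append((nbytes, seconds))

    def throughput(self) -> float:
        """Average bytes/second over the window, or 0 with no samples."""
        total_bytes = sum(b for b, _ in self.samples)
        total_time = sum(t for _, t in self.samples)
        return total_bytes / total_time if total_time > 0 else 0.0

    def eta_seconds(self, bytes_remaining: int) -> float:
        """Estimated seconds left; infinity until we have a sample."""
        rate = self.throughput()
        return bytes_remaining / rate if rate > 0 else float("inf")
```

Feeding the estimate into the existing progress messages would replace the silent stall with a ticking ETA, even though the underlying work is unchanged.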
How about we combine the two methods? If the file modification date didn't change, check the file hash; if it did change, the file was definitely modified, so we can skip calculating the hash and just back it up. 😄
For MCA files, which I'd assume are the bulk of the processing load, we could probably use the chunk update timestamps in the header. However, I'd strongly recommend adding a config option to disable 'smartness' like this, just in case a server admin is aware of anything that might screw with these indicators...
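For reference, an Anvil region file starts with an 8 KiB header: a 4 KiB chunk-location table followed by a 4 KiB table of big-endian 32-bit epoch-second timestamps, one per chunk slot, recording when each chunk was last updated. A sketch of reading them (illustrative only, and exactly the kind of logic the config option above should be able to bypass):

```python
import struct

def region_chunk_timestamps(header: bytes) -> list[int]:
    """Parse the 1024 per-chunk 'last updated' timestamps from an
    Anvil (.mca) region file header. A zero entry means that chunk
    slot has never been generated."""
    if len(header) < 8192:
        raise ValueError("region header must be at least 8192 bytes")
    # Bytes 4096..8191 hold 1024 big-endian unsigned 32-bit timestamps.
    return list(struct.unpack(">1024I", header[4096:8192]))

def region_modified_since(header: bytes, since_epoch: int) -> bool:
    """True if any chunk in the region was updated after `since_epoch`."""
    return any(ts > since_epoch for ts in region_chunk_timestamps(header))
```

With this, deciding whether a region file needs backing up costs an 8 KiB read instead of hashing a file that can be tens of megabytes.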
As far as I can tell we wouldn't even need to estimate, since we know (1) how many files need to be hashed, (2) how big they are, and (3) how many bytes of the current file have been processed so far.
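Given those three known quantities, the progress fraction is exact arithmetic, something like (hypothetical helper):

```python
def backup_progress(file_sizes: list[int],
                    files_done: int,
                    bytes_into_current: int) -> float:
    """Exact hashing progress in [0, 1]: bytes already hashed
    (completed files plus the partial current file) over total bytes.
    `file_sizes` is ordered the same way the files are processed."""
    total = sum(file_sizes)
    if total == 0:
        return 1.0  # nothing to hash counts as done
    done = sum(file_sizes[:files_done]) + bytes_into_current
    return done / total
```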
Okay, so here's the deal:
In order to calculate which files to back up in an incremental / differential partial, we need to make a hash of each file.
This is... slow, because it effectively means reading the entire world from disk as if we were making a full backup, before we even start making one. We then have to re-read the files we want to back up from disk when the backup actually starts.
On larger worlds, this can lead to a long time sat on the "Backup Starting" message before any progress updates are sent, which can make a backup look stalled even when it isn't. It also extends the backup time pretty significantly, of course, because you're reading the entire world just to figure out what to back up.
Problem is... how can one solve this?