xdelta is not detecting large amounts of data to patch #252
Comments
@latot If you have enough memory, you can set the source window size (-B) to the source file size.
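A minimal command-line sketch of that suggestion, assuming a source file named source.bin and a target named target.bin (names are illustrative) and enough free memory to hold the whole source:

# Read the source size in bytes and pass it as the source window (-B),
# so xdelta3 can search the entire source for matches.
SRC_SIZE=$(stat -c%s source.bin)
xdelta3 -e -B "$SRC_SIZE" -s source.bin target.bin patch.xd3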
I have a similar problem.
The resulting patch is 1.6 GB! Unbelievable! Who can tell me why?
Hi @ivan386, it is nice to know there is another method; no idea why I never got the email with your answer. So, why doesn't xdelta detect this? If the algorithm detects equal sections in the data, shouldn't it check whether the strings have more in common than just that specific substring? Thanks.
Can anyone else confirm this?
Hmm, I don't know if anyone else wants to try it too...
Hi, I was patching a split copy of a 1.6 GB file. I split it with
split --bytes=800MB file P
, which gives two files, one of 800 MB and one of 761 MB; up to there everything is fine. Then I run xdelta3 to build a delta from each piece to the full file, which should make it easy to construct the new patches. But here is the weird thing: for the first delta, whose source is the first piece (which is also the first part of the big file), the delta file is 781 MB, basically all the rest of the big file, so all is fine.
But with the second piece the delta file is 1.5 GB, about 95% of the original file, even though the source already contains about 45% of it. xdelta3 is not detecting the data.
Bye.
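A minimal reconstruction of the workflow described above, assuming the default output names from the split command (Paa and Pab), default xdelta3 encode settings, and illustrative delta file names:

# Split the 1.6 GB file into two pieces (about 800 MB and 761 MB).
split --bytes=800MB file P

# Encode a delta from each piece to the full file.
xdelta3 -e -s Paa file delta_first.xd3    # reported at ~781 MB, as expected
xdelta3 -e -s Pab file delta_second.xd3   # reported at ~1.5 GB, the surprising result

With the default source window, xdelta3 only searches a limited portion of the source for matches, which would be consistent with the suggestion above to raise -B.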