Consider a .tmp extension for incomplete parts instead of .zpaq. This can improve compatibility with file sync applications. #143
Comments
You can try backup with -tmp
Please compile that version so I can test.
zpaqfranz upgrade -force
Looks good for now...
With this patch, it is now safe to add to archives and run tests simultaneously.
Only if they are multipart.
Not sure if I should file this as a new issue.
To reproduce:
That creates a file test-000001.tmp and then renames it to test-000001.zpaq at the end. Then I extract the index file:
And then I run the backup again, but this time using the index file. The .tmp file is no longer created:
The index file does nothing in extraction. What are you doing, and why?
I'm doing that because the backup files are stored in the cloud (Backblaze), and downloading the data again costs a lot of money (5 TB of downloads every time doesn't make sense). So I use the index files. I would like to avoid using rclone, as it introduces another potential point of failure. The index files are a perfect solution to that problem.
Franco, I’m syncing .ZPAQ archives to a remote server using Syncthing. While new archive parts (archive0002.zpaq, etc.) are being created, Syncthing attempts to read them to generate hashes. It detects that they are currently being modified and starts over, leading to a repeated cycle.
Is there an option for ZPAQFRANZ to create files with a .tmp extension initially, and then only change the extension to .zpaq once the part is fully completed?
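The pattern being requested is the classic write-then-rename scheme: the in-progress part carries a temporary extension, and only a completed part ever appears under the `.zpaq` name, so a sync tool matching `*.zpaq` never picks up a half-written file. A minimal sketch of that lifecycle in plain shell (file names are illustrative, not zpaqfranz's actual naming):

```shell
#!/bin/sh
set -e
mkdir -p demo

# 1. While the part is being written it carries the .tmp extension,
#    so a sync tool watching *.zpaq ignores it.
printf 'chunk data' > demo/archive-000001.tmp

# 2. When the part is complete, rename it. On the same filesystem,
#    rename() is atomic, so the .zpaq name only ever refers to a
#    fully written file.
mv demo/archive-000001.tmp demo/archive-000001.zpaq

ls demo
```

The key property is the atomic rename on the final step: there is no window in which a partially written `.zpaq` file is visible, which is exactly what breaks Syncthing's hashing loop today.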
Syncthing doesn’t support exclusions based on date, and the developers refuse to work on such a feature.