
Replace zsync2's HTTP backend #30

Open
TheAssassin opened this issue Dec 20, 2018 · 5 comments

Comments

@TheAssassin
Member

We might have to switch away from CPR soon: the project seems mostly dead and no longer compiles on the latest distros. Also, the way it is currently used might not be very efficient, see #29.

If possible, we should try to use https://aria2.github.io/manual/en/html/libaria2.html instead. aria2 is a very intuitive tool for downloading files in parallel, needing just a few flags. It is very configurable and flexible on the CLI, so its library, libaria2, might be an option for implementing more efficient downloading of the chunks. See the sketch below.
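For reference, here is a minimal sketch of what driving libaria2 looks like, based on the documentation linked above (the URL and output directory are made up for illustration; error handling is mostly omitted). It mirrors the aria2c CLI: you queue URIs with the same key/value options the CLI accepts and let the session run.

```cpp
// Minimal libaria2 usage sketch: queue one URI and run the session to completion.
#include <aria2/aria2.h>
#include <string>
#include <vector>

int main() {
    aria2::libraryInit();

    aria2::SessionConfig config;
    aria2::Session* session = aria2::sessionNew(aria2::KeyVals(), config);

    // Options are the same key/value pairs the aria2c CLI accepts.
    aria2::KeyVals options = {
        {"dir", "/tmp"},    // hypothetical output directory
        {"split", "4"},     // download with up to 4 parallel connections
    };

    std::vector<std::string> uris = {
        "https://example.com/some.AppImage"  // hypothetical URL
    };

    aria2::A2Gid gid;
    aria2::addUri(session, &gid, uris, options);

    // Drive the event loop until all queued downloads have finished.
    aria2::run(session, aria2::RUN_DEFAULT);

    aria2::sessionFinal(session);
    aria2::libraryDeinit();
    return 0;
}
```

Note that everything ends up as files under `dir`; as far as I can tell there is no hook to receive the response body into a caller-provided buffer.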

@TheAssassin
Member Author

libaria2 is more of a wrapper providing the same feature set as the aria2c CLI. That alone would not be much of an issue, but it lacks features we need (e.g., HEAD requests, or downloading into a custom buffer so we can control how the file is written out). I think with some hacking (and the use of something like tmpfs) we could make it usable to some extent, but that would be neither elegant nor robust, and it would add constraints on the target systems AppImageUpdate is supposed to run on.

We must find another library.

@TheAssassin
Member Author

Checking out https://www.boost.org/doc/libs/develop/libs/beast/example/http/client/sync/http_client_sync.cpp right now. It's quite low-level, which might allow us to implement a workflow that suits our needs best. A rough sketch of a HEAD request with Beast is below.
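A minimal sketch of a synchronous HEAD request with Boost.Beast, adapted from the example linked above (plain HTTP, no TLS; host and target are placeholders):

```cpp
// Synchronous HTTP HEAD request with Boost.Beast. Illustrative sketch only.
#include <boost/asio/connect.hpp>
#include <boost/asio/ip/tcp.hpp>
#include <boost/beast/core.hpp>
#include <boost/beast/http.hpp>
#include <iostream>
#include <string>

namespace beast = boost::beast;
namespace http = beast::http;
namespace net = boost::asio;
using tcp = net::ip::tcp;

int main() {
    const std::string host = "example.com";       // hypothetical host
    const std::string target = "/some.AppImage";  // hypothetical path

    net::io_context ioc;
    tcp::resolver resolver{ioc};
    tcp::socket socket{ioc};

    auto const results = resolver.resolve(host, "80");
    net::connect(socket, results.begin(), results.end());

    // Build and send the HEAD request.
    http::request<http::empty_body> req{http::verb::head, target, 11};
    req.set(http::field::host, host);
    req.set(http::field::user_agent, "zsync2-prototype");
    http::write(socket, req);

    // HEAD responses carry headers only, so tell the parser to skip the body.
    beast::flat_buffer buffer;
    http::response_parser<http::empty_body> parser;
    parser.skip(true);
    http::read(socket, buffer, parser);

    std::cout << "Status: " << parser.get().result_int() << "\n"
              << "Content-Length: " << parser.get()[http::field::content_length]
              << "\n";

    beast::error_code ec;
    socket.shutdown(tcp::socket::shutdown_both, ec);
    return 0;
}
```

Swapping `http::verb::head` for `http::verb::get`, adding a `Range: bytes=...` header, and reading into an `http::response<http::string_body>` would pull a single chunk straight into an in-memory `std::string`, i.e. exactly the kind of control over the download buffer we are missing right now.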

@ghuls

ghuls commented Dec 21, 2018

Newer HTTP libraries hopefully also support files bigger than 2 GiB/4 GiB. See: #31

@probonopd
Member

I wouldn't consider libcurl "old" ;-)

@ghuls

ghuls commented Dec 21, 2018

@probonopd libcurl is not the problem. It is the wrapper code around it that does not handle big files.
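To make the failure mode concrete (assuming the usual culprit, a 32-bit size type somewhere in the wrapper code; #31 has the actual details), here is a tiny illustration:

```cpp
// Hypothetical illustration of the >2 GiB problem: parsing Content-Length
// into a 32-bit type fails, while a 64-bit type handles it fine.
#include <cstdint>
#include <iostream>
#include <string>

int main() {
    const std::string contentLength = "3221225472";  // 3 GiB

    // Broken: a 32-bit signed int cannot represent 3 GiB. std::stoi throws
    // std::out_of_range here; other conversions may silently truncate instead.
    // int size32 = std::stoi(contentLength);

    // Correct: use a 64-bit type for file sizes and offsets throughout.
    std::uint64_t size64 = std::stoull(contentLength);
    std::cout << size64 << "\n";
    return 0;
}
```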
