
Workaround for chrome bug net::ERR_CACHE_OPERATION_NOT_SUPPORTED #67

Open
sroboubi opened this issue Sep 14, 2018 · 2 comments

@sroboubi

When loading a TIFF with DEFLATE compression, geotiff.js makes a large number of partial GET requests (around 20 to 30). I assume the content is just distributed that way throughout the file. This seems to trigger a bug in Chrome, where the browser tries to satisfy some of these requests with parts of previous responses and fails. Related links:

webtorrent/webtorrent#1193 (comment)
mozilla/pdf.js#9022
https://bugs.chromium.org/p/chromium/issues/detail?id=770694

Would it be possible to implement some sort of workaround in geotiff.js? For example, a simple one would be to allow the caller to pass in additional headers to be added to each request, which could set the Cache-Control: no-cache header (https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control). Or perhaps implement a retry on a per-GET basis to catch this specific error?

@sroboubi
Author

After looking closer at the code, it seems there is already a way to specify request headers through the options passed into GeoTIFF.fromUrl. After adding the following headers I no longer see the error; however, everything slows down, as expected:

```js
const tiff = await GeoTIFF.fromUrl(fileServer + remoteFilename, {headers: {"Cache-Control": "no-cache, no-store"}});
```

Please let me know if you think there is a better workaround, such as adding a retry in source.js around the fetch() call.
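
For illustration, a retry along those lines might look something like the sketch below. fetchWithRetry is a hypothetical helper, not part of geotiff.js; since fetch() rejects with a generic TypeError, the specific net:: error code cannot be detected from script, so any network failure gets retried:

```js
// Hypothetical sketch: retry fetch(), bypassing the browser cache on
// subsequent attempts. Not part of geotiff.js.
async function fetchWithRetry(url, options = {}, retries = 2) {
  for (let attempt = 0; ; ++attempt) {
    try {
      return await fetch(url, options);
    } catch (err) {
      // fetch() rejects with a generic TypeError on network errors;
      // the underlying net:: code is not exposed to scripts.
      if (attempt >= retries) throw err;
      // Ask the browser to skip its HTTP cache on the retry.
      options = { ...options, cache: 'no-store' };
    }
  }
}
```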

@constantinius
Member

Hi @sroboubi,

This seems like a tricky issue and is related to the reasons why I don't like the current implementation of the blocked source.

First off: I don't think that sending many, many block requests is viable, as browsers set a rather harsh limit on the number of concurrent connections per host. There is already a small improvement implemented: consecutive blocks are fetched with a single request, but if there are 'holes' between the blocks, many separate requests are sent, which is the behavior you are experiencing.
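
As a sketch of that merging logic (illustrative only; blockIds and blockSize are assumed inputs, not the actual internals of the blocked source):

```js
// Illustrative sketch: collapse consecutive block indices into byte
// ranges, one HTTP Range request per run. A 'hole' starts a new range.
function blocksToRanges(blockIds, blockSize) {
  const sorted = [...new Set(blockIds)].sort((a, b) => a - b);
  const ranges = [];
  for (const id of sorted) {
    const last = ranges[ranges.length - 1];
    if (last && id === last.end + 1) {
      last.end = id; // consecutive: extend the current run
    } else {
      ranges.push({ start: id, end: id }); // hole: start a new run
    }
  }
  // Each entry maps to one "Range: bytes=..." request.
  return ranges.map(({ start, end }) => ({
    offset: start * blockSize,
    length: (end - start + 1) * blockSize,
  }));
}
```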

It is also possible to request multiple patches of data with a single HTTP range request, where the response is returned as an HTTP multipart, but I never got around to implementing that. I think this would be the most suitable approach, as only one request would be sent to the server per image read.
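
For reference, such a multi-range request puts all byte ranges into one Range header; a server that supports it answers with 206 Partial Content and a multipart/byteranges body whose parts each carry their own Content-Range. A minimal sketch, with parsing of the multipart body omitted (the ranges argument reuses the { offset, length } shape from the sketch above):

```js
// Sketch: request several byte ranges in one round trip.
async function fetchRanges(url, ranges) {
  const spec = ranges
    .map(({ offset, length }) => `${offset}-${offset + length - 1}`)
    .join(', ');
  const response = await fetch(url, { headers: { Range: `bytes=${spec}` } });
  // When the server honors the multi-range request, the Content-Type is
  // multipart/byteranges; the body must then be split into its parts
  // before the individual blocks can be used.
  return response;
}
```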

Regarding the retry mechanism: I think it is feasible and sensible to do so, even if I'm not a big fan of circumventing browser issues. Unfortunately, I'm not really able to implement this right now, but I'm happy to accept PRs.

Thanks again
