Google Maps example needed #101
Thanks for submitting this issue. I'll try to answer to the best of my knowledge.
First off, I have to remark that TIFF is a generic file format, providing options for a very large field of use cases. This is great for interoperability but comes at the cost of some overhead. The simplest example, in my opinion, is an RGB JPEG-encoded TIFF. It is almost exactly the same file as a simple JPEG of the same data, just wrapped in the TIFF headers/IFDs. In other cases, the overhead compared to natively supported file types like PNG/JPEG can be even bigger. What I want to point out is that TIFFs might not be the best option for every use case, especially if you consider the extra cost you have to pay when you want to use them.
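To make the header/IFD overhead tangible, here is a minimal sketch (hand-built bytes for illustration, not the library's code or a real image file) that parses the fixed 8-byte TIFF header every file carries before any image data:

```javascript
// A minimal little-endian TIFF header, constructed by hand:
const header = Buffer.from([
  0x49, 0x49,             // "II" = little-endian byte order ("MM" would be big-endian)
  0x2a, 0x00,             // the magic number 42
  0x08, 0x00, 0x00, 0x00, // byte offset of the first IFD (Image File Directory)
]);

function parseTiffHeader(buf) {
  const littleEndian = buf.toString('ascii', 0, 2) === 'II';
  const magic = littleEndian ? buf.readUInt16LE(2) : buf.readUInt16BE(2);
  if (magic !== 42) throw new Error('not a TIFF');
  const ifdOffset = littleEndian ? buf.readUInt32LE(4) : buf.readUInt32BE(4);
  return { littleEndian, ifdOffset };
}

console.log(parseTiffHeader(header)); // { littleEndian: true, ifdOffset: 8 }
```

The IFD at that offset then holds 12-byte tag entries (compression, tile offsets, etc.), which is exactly the wrapping a JPEG-in-TIFF carries on top of the plain JPEG stream.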
Unfortunately, preflight requests are an essential part of the CORS mechanism built into all modern browsers. They are always sent when the origin of the website is not the same as that of the resource and the request is not "simple", i.e. when it uses a method or header outside the CORS safelist. There are some ways to mitigate that problem a little. One way is to ensure that the resource is accessible via a URL that has the same origin as the main HTML page. I'm not sure if this is easily configured in S3; my guess is that there is some smart rerouting capability to be configured somewhere. As a last resort, a proxy that runs under the same origin can be used.
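To make the "simple request" rule concrete, here is a rough sketch of the Fetch spec's safelist check (simplified; the real rules also constrain header values and lengths):

```javascript
// Simplified model of when a cross-origin request triggers a CORS preflight.
const SAFE_METHODS = ['GET', 'HEAD', 'POST'];
const SAFE_HEADERS = ['accept', 'accept-language', 'content-language', 'content-type'];
const SAFE_CONTENT_TYPES = [
  'application/x-www-form-urlencoded', 'multipart/form-data', 'text/plain',
];

function needsPreflight(method, headers) {
  if (!SAFE_METHODS.includes(method.toUpperCase())) return true;
  return Object.entries(headers).some(([name, value]) => {
    const n = name.toLowerCase();
    if (!SAFE_HEADERS.includes(n)) return true;
    // Content-Type is only safelisted for a few values:
    if (n === 'content-type') return !SAFE_CONTENT_TYPES.includes(value.toLowerCase());
    return false;
  });
}

// An author-set Range header is not in the safelist above, which is why
// cross-origin range reads get preflighted:
console.log(needsPreflight('GET', { Range: 'bytes=0-65535' })); // true
console.log(needsPreflight('GET', {}));                         // false
```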
I'm a little confused here, as this does not seem to make sense. Using compression results in smaller file sizes, and thus should increase the download performance, not the other way around. Maybe there is an additional component here that I don't see. For TIFFs: using DEFLATE should in most cases result in smaller tile/row sizes, thus reducing the amount of raw bytes that need to be transmitted. The compression ratio depends heavily on the actual content of the file, the predictor used, and the tiling scheme. You have to benchmark for your use case whether there is a net gain in performance.
I'm sorry, but that is not possible. COG Explorer is hosted on GitHub itself and has no associated data next to it (so it always uses CORS requests to access remote files). You can, however, as is the nature of open-source software, take the bundled application and place it next to your data files to compare the different options. In that case no CORS requests are necessary. Additionally: as already mentioned, there is no single "optimal" TIFF configuration suitable for all needs, as several factors need to be weighed.
I did not benchmark the cloud storage providers and I myself do not have any experience with performance-tuning the access, though I think I can somewhat understand the additional overhead. I hope I could answer your questions in a satisfactory manner. Thank you for your interest in this project!
Thank you Fabian for your detailed answer. I will now start finding solutions for each problem to improve speed. I think the approach of range reading on the client side is the future, and NASA and ESA are, as you know, also pursuing this because it's so much easier than setting up GDAL servers to tile millions of files for each zoom level. If I can bring the responsiveness to parity with pre-rendered tiles, then your library is the future. I'll keep you updated on this issue.
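For context on "range reading on the client side": it boils down to HTTP `Range` requests against a single file, one per needed tile/block of the COG, instead of one request per pre-rendered tile image. A tiny sketch (the helper name is hypothetical; geotiff.js builds such headers internally):

```javascript
// Build the HTTP Range header for a byte window of a remote file.
// HTTP byte ranges are inclusive on both ends.
function rangeHeader(offset, byteCount) {
  return `bytes=${offset}-${offset + byteCount - 1}`;
}

// e.g. a 64 KiB tile stored at byte offset 1 MiB inside the COG:
console.log(rangeHeader(1048576, 65536)); // "bytes=1048576-1114111"
```

The server answers with `206 Partial Content`, so the client never downloads the parts of the TIFF it doesn't display.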
You're welcome, and I wish you the best of luck for your ventures. I've just returned from this year's FOSS4G, and this talk by Ivàn Sanchez Ortega was eye-opening: https://media.ccc.de/v/bucharest-348-breaking-the-curse-of-raster-processing-software-as-a-service There are hurdles for using such a technology, unfortunately, but I too think that this is the way to go.
I saw your FOSS4G presentation (very nice) and, by coincidence, I am also from Bucharest, but now I live in Munich. Thanks for the extra link; I am watching it now. I will post my findings on this thread, as I predict it to be a big breakthrough that absolutely needs open sourcing.
Here are the solutions I found:

```js
this.map = new Map({
  layers: [
    ...
  ],
  // Improve user experience by loading tiles while animating. Will make
  // animations stutter on mobile or slow devices.
  loadTilesWhileAnimating: true,
  loadTilesWhileInteracting: true,
  view: new View({
    ...
  }),
});
```

```js
import { fromUrl } from 'geotiff/dist/geotiff.bundle.min.js';
```
Hey all, I am very new to this topic, and currently I am trying to host a GeoTIFF on my website with the Google Maps JavaScript API, using geotiff.js. @philip-firstorder has mentioned: Can you elaborate? I am currently confused regarding how to use geotiff.js with Google Maps. I am able to read the GeoTIFF and fetch its attributes using geotiff.js. And since Google Maps only supports raster tiling natively, how do I stream the COG data to Google Maps?
I implemented GeoTIFF with Google Maps; it has worked in production for over a year now. However, the code is copyrighted and I cannot open-source it.
Hey @philip-firstorder, @neelduttahere
Problem:
I want to test the performance of geotiff.js on Google Maps vs. reading pre-rendered tiles directly from S3.
I loaded my TIFF in COG Explorer, but it's 3x slower than loading pre-rendered PNG tiles from S3 in Google Maps.
What I expect:
To have similar performance reading from a single TIFF with geotiff.js as opposed to loading pre-rendered PNG tiles.
Reasons for delay: