Elasticsearch datasource limited to 10,000 records max #56255
Replies: 9 comments 1 reply
-
I can't see us solving this solely for the purpose of exporting to CSV; it's probably a lot easier to write some custom code that queries Elasticsearch and generates a CSV file. The current table panel is not built to process and render over 40k records in the browser. However, this is good input for the upcoming work of integrating Elasticsearch logs in Explore (#15999), where I see this functionality fitting better.
-
It would be nice if you would at least warn the user that the result was truncated.
-
The Scroll API is a good idea. In addition, we wouldn't have to wait for all of the data to load.
-
The Scroll API is not intended for interactive use; for that purpose there is "search after": https://www.elastic.co/guide/en/elasticsearch/reference/6.8/search-request-search-after.html
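To illustrate the `search_after` approach mentioned above, here is a minimal Python sketch of the pagination loop. `fake_search` is a purely illustrative stand-in for a real client call (e.g. `es.search(...)` with a `search_after` parameter); the field name `timestamp` is an assumed sort key.

```python
# Sketch of search_after-style pagination. `fake_search` simulates an
# Elasticsearch query sorted on a unique field; it is NOT a real client call.

def fake_search(docs, size, search_after=None, sort_key=lambda d: d["timestamp"]):
    """Return up to `size` docs in sort order, starting after `search_after`."""
    ordered = sorted(docs, key=sort_key)
    if search_after is not None:
        ordered = [d for d in ordered if sort_key(d) > search_after]
    return ordered[:size]

def fetch_all(docs, page_size=3):
    """Pull every document page by page; the sort value of the last hit
    becomes the cursor for the next request, which is how search_after
    avoids the deep-paging (from + size) limit."""
    results, cursor = [], None
    while True:
        page = fake_search(docs, page_size, search_after=cursor)
        if not page:
            break
        results.extend(page)
        cursor = page[-1]["timestamp"]
    return results
```

The key property is that each request is independent and cheap for the cluster, unlike a scroll context that must be kept open server-side.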
-
The short-term fix could be to simply lift the hardcoded limit, but we need to check the performance implications. Longer term, we should look into the API options. AFAIK, we don't have the concept of datasource-based pagination yet; it would mean allowing the table panel not just to paginate the initial response data but to request more data. This needs a bit of thought and could be interesting for other datasources as well.
-
Hi all, @davkal, is there any solution, or is the situation still the same?
-
To summarize the situation: Elasticsearch will not return more than 10,000 results at a time (this can be configured in the Elasticsearch settings). Also, the visualizations in Grafana were not designed to render millions of rows, for example, so perhaps there are two separate features:
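For reference, the per-index cap mentioned above is the `index.max_result_window` setting, which defaults to 10,000. A sketch of the request body for raising it via a `PUT <index>/_settings` call (the value 50000 is an arbitrary example; raising it increases heap and latency cost for deep result sets, which is why `search_after` or scroll is usually preferred):

```json
{
  "index": {
    "max_result_window": 50000
  }
}
```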
-
I think being able to go over the 10,000 limit would be a good idea, despite the fact that visualizations in Grafana are not designed to render millions of rows. There are certainly use cases in which this would be useful, for example, searching for a large amount of data, then visualizing a small subset of that data through Grafana's Transform feature (filter data by values, etc.).
-
Hello, as you may have heard, we are transitioning away from using discussions for feature requests. We are migrating this discussion to an issue and closing the discussion. The issue is #83016. Feel free to continue the conversation there. Thank you!
-
Current situation:
The Elasticsearch datasource silently truncates the result list to 10,000 items.
I tried to increase the Size parameter of the query beyond 10K, but then the query returns almost instantly with 0 results and the panel shows "No data points".
It works in Kibana: we can get the full list of records from the index for a whole month, 47,131 records in total.
What would you like to be added:
The API returns results paginated at exactly 10,000 records at a time. There is a "scroll" API that the client needs to use to pull the complete record set.
https://discuss.elastic.co/t/how-to-get-data-more-than-10000-in-elasticsearch/107869
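The scroll flow referenced above can be sketched as follows. `FakeScroll` is a purely illustrative stand-in for the real endpoints (an initial search with a `scroll` keep-alive, then repeated calls to the scroll endpoint with the returned `scroll_id` until a page comes back empty); none of these class or method names come from a real client library.

```python
# Sketch of scroll-style pagination against a simulated result set.

class FakeScroll:
    """Hands out fixed-size pages of a result set, keyed by a scroll id."""
    def __init__(self, docs, page_size):
        self._docs, self._size, self._pos = docs, page_size, {}

    def search(self):
        """Initial request: returns a scroll id plus the first page."""
        self._pos["s1"] = self._size
        return "s1", self._docs[:self._size]

    def scroll(self, scroll_id):
        """Follow-up request: returns the next page for that scroll id."""
        start = self._pos[scroll_id]
        self._pos[scroll_id] = start + self._size
        return self._docs[start:start + self._size]

def fetch_all(api):
    """Keep scrolling until an empty page signals the end of the record set."""
    scroll_id, page = api.search()
    results = list(page)
    while page:
        page = api.scroll(scroll_id)
        results.extend(page)
    return results
```

Unlike `search_after`, a real scroll holds a search context open on the cluster for the keep-alive duration, which is why it suits batch exports (like the CSV use case here) rather than interactive dashboards.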
Why is this needed:
The use case is a query returning all entries from an index for a given time range and presenting them in the Table panel for the Download to CSV feature.
Also, silently truncating results may be regarded as a bad thing.