📝 Scenario
As a FinOps practitioner, I need to ingest data into a queryable data store so that I can report on data at scale, beyond $5M/mo in spend.
💎 Solution
Support large datasets (e.g., 500 GB/mo) with up to 7 years of historical data that refreshes when the source data changes. This will be done by adding an option to ingest data into Azure Data Explorer and updating reporting to leverage that database.
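As a rough sketch of what reporting over the ingested data could look like, the snippet below builds a KQL query that rolls billed cost up by calendar month. The `Costs` table and the FOCUS-style `ChargePeriodStart` and `BilledCost` columns are illustrative assumptions, not names confirmed by this issue.

```python
# Illustrative sketch only: builds a KQL query for monthly cost totals over a
# hypothetical "Costs" table in Azure Data Explorer. The table and column names
# (ChargePeriodStart, BilledCost) are assumptions, not confirmed by this issue.

def build_monthly_cost_query(table: str = "Costs", months: int = 12) -> str:
    """Return a KQL query that sums billed cost per calendar month."""
    return (
        f"{table}\n"
        f"| where ChargePeriodStart >= ago({months * 30}d)\n"
        "| summarize TotalBilledCost = sum(BilledCost)\n"
        "    by Month = startofmonth(ChargePeriodStart)\n"
        "| order by Month asc"
    )


if __name__ == "__main__":
    print(build_monthly_cost_query())
    # Running the query requires a live cluster, e.g. with the azure-kusto-data
    # package (not part of the standard library):
    #   from azure.kusto.data import KustoClient, KustoConnectionStringBuilder
    #   kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(cluster_url)
    #   rows = KustoClient(kcsb).execute(database, query).primary_results[0]
```

The query itself is ordinary KQL (`summarize ... by startofmonth(...)`), so the same shape would work from Power BI, the ADX web UI, or any Kusto client.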
📋 Tasks
Required tasks
Stretch goals
ℹ️ Additional context
An internal analysis of the optimal data store for the largest datasets concluded that Azure Data Explorer was the best option, balancing cost, performance, and scale.
🙋‍♀️ Ask for the community
We could use your help:
Please vote this issue up (👍) to prioritize it.
Leave comments to help us solidify the vision.
@t-esslinger Sorry for missing the comment. Yes, this is still in the backlog. We're making progress slowly. I'm reopening this issue to track everything needed.