Performance Testing Approach #1407
joocer started this conversation in Improvements
Replies: 0 comments
Performance testing is required to ensure that tuning and optimizations provide an overall benefit, and to avoid situations where optimizing one case is materially detrimental to others.
The performance test rig(s) need not be high specification, and a mix of rigs should be used. This mix is not comprehensive, but it will provide a view across a spectrum of servers.
The performance test will run a battery of about 100 statements, each executed 100 times. This will show the impact of changes across a range of queries and provide enough samples of each to gather meaningful mean, p50, and p95 statistics. Statistics will be collected for each query, for groups of queries with common characteristics, and for the complete set.
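The summary statistics described above can be sketched with the standard library. This is a minimal illustration, assuming run times are collected as a list of seconds per query; the function name and shape are illustrative, not from the Opteryx codebase.

```python
import statistics

def summarize(run_times):
    """Summarize a list of run times (seconds) as mean, p50 and p95."""
    ordered = sorted(run_times)
    # quantiles with n=100 yields 99 cut points, one per percentile;
    # index 49 is the 50th percentile, index 94 is the 95th
    cuts = statistics.quantiles(ordered, n=100)
    return {
        "mean": statistics.fmean(ordered),
        "p50": cuts[49],
        "p95": cuts[94],
    }

# 100 samples of a single query, matching the planned sample count
samples = [0.9 + 0.002 * i for i in range(100)]
summary = summarize(samples)
```

The same helper can be applied per query, per query group, and over the complete set.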
The batch of tests will be run 3 times for each server on a nightly schedule (0400?).
The run times will be used to generate control charts, which will be monitored for variance, with tickets raised for outliers or variances. This will be a new service written to run on Cloud Run. This work will likely inform improvements to Opteryx to simplify this type of analysis directly in the engine.
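The outlier check behind the control charts could look something like the following sketch. It assumes a simple mean ± 3σ control limit over historical run times; the function names are hypothetical and the alerting threshold is a placeholder, not a decision recorded in this proposal.

```python
import statistics

def control_limits(history, sigmas=3.0):
    """Compute lower/upper control-chart limits from historical run times."""
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    return mean - sigmas * stdev, mean + sigmas * stdev

def flag_outliers(history, new_runs, sigmas=3.0):
    """Return the new run times that fall outside the control limits."""
    low, high = control_limits(history, sigmas)
    return [t for t in new_runs if t < low or t > high]

# Historical nightly run times for one query (seconds)
history = [1.00, 1.02, 0.98, 1.01, 0.99, 1.03, 0.97, 1.00]
# A sudden regression should be flagged; an in-band result should not
outliers = flag_outliers(history, [1.01, 1.45])
```

A service on Cloud Run could run this check per query after each nightly batch and raise a ticket for any flagged result.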