What should I do when my data volume is very large, more than 100 million? #383
Comments
Throw more money at your Postgres instance?
@smockgithub this question is very vague. You haven't defined what "performance" means or which database you're using. If you're experiencing slowness, you may need a bigger instance or some caching mechanism.
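The comment above mentions a "caching mechanism" without showing one. As a minimal illustration (the decorator, function names, and return values below are all hypothetical, not part of any project discussed here), a TTL cache in front of a slow query can keep a hot key from hitting a 100M-row table on every request:

```python
import time
from functools import wraps

def ttl_cache(ttl_seconds=60):
    """Cache a function's result per argument tuple for ttl_seconds.
    Hypothetical helper for illustration; production setups more often
    use an external cache such as Redis or memcached."""
    def decorator(fn):
        store = {}  # args -> (timestamp, value)

        @wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit is not None and now - hit[0] < ttl_seconds:
                return hit[1]  # fresh enough: skip the expensive call
            value = fn(*args)
            store[args] = (now, value)
            return value
        return wrapper
    return decorator

calls = {"count": 0}

@ttl_cache(ttl_seconds=60)
def expensive_query(user_id):
    # Stand-in for a slow query against a very large table.
    calls["count"] += 1
    return {"user_id": user_id, "rows": 42}

expensive_query(1)
expensive_query(1)  # second call is served from the cache
```

After the two calls above, the underlying "query" has only run once. The trade-off is staleness: anything cached for 60 seconds can be up to 60 seconds out of date, which is acceptable for dashboards but not for, say, balance checks.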
Can this be closed? I'm not sure there are any other answers besides "ensure your indexes meet your needs" and "ensure the Postgres instance is big enough".
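Beyond indexing and instance size, Postgres also supports declarative range partitioning, which is a common answer for tables in the 100M-row range. As a sketch (table and column names here are invented for illustration, not taken from the thread), the DDL for yearly partitions can be generated like this:

```python
def partition_ddl(table, column, years):
    """Generate PostgreSQL range-partitioning DDL for one partition per year.
    Illustrative only; adapt table/column names and ranges to your schema."""
    statements = [
        f"CREATE TABLE {table} (id bigint, {column} date) "
        f"PARTITION BY RANGE ({column});"
    ]
    for year in years:
        statements.append(
            f"CREATE TABLE {table}_{year} PARTITION OF {table} "
            f"FOR VALUES FROM ('{year}-01-01') TO ('{year + 1}-01-01');"
        )
    return statements

statements = partition_ddl("events", "created_at", [2023, 2024])
```

With partitions in place, queries that filter on the partition key only scan the relevant partitions, and old data can be dropped by detaching a partition instead of running a massive `DELETE`.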
When there is a lot of data, the relational tables themselves grow very large. How should this be handled to maintain performance?