
How to improve the search speed for ten million rows #9

Open
AliredDevils opened this issue Apr 21, 2020 · 1 comment

Comments

@AliredDevils

I use PostgreSQL as the backend to store my data, which is at the ten-million-row scale.
I ran some tests and found:

1. Get note by name: takes about 5-6 seconds, which is too slow for me.

2. List notes in one project: takes about 70 milliseconds.

So I want to know how to improve the search speed for notes or occurrences.

Thanks

@argowang

argowang commented Jun 8, 2021

You should consider changing the column type from json to jsonb and creating indexes on the fields you commonly query.
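A minimal sketch of that change, assuming a hypothetical `notes` table with a json column named `data` (the real table and column names in your schema will differ):

```sql
-- Convert the column from json to jsonb so PostgreSQL can index its contents.
ALTER TABLE notes
    ALTER COLUMN data TYPE jsonb USING data::jsonb;

-- A GIN index speeds up containment queries (@>) over the whole document.
CREATE INDEX idx_notes_data ON notes USING GIN (data);

-- For a field you look up by equality (e.g. fetching a note by name),
-- a B-tree expression index is smaller and faster than a GIN index.
CREATE INDEX idx_notes_name ON notes ((data ->> 'name'));

-- A query must use the same expression as the index to benefit from it:
-- SELECT * FROM notes WHERE data ->> 'name' = 'my-note';
```

Note that plain `json` columns cannot be indexed on their contents at all, which is why the conversion to `jsonb` comes first.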
