Currently, the useGetPubmedIDs hook assumes we can only submit 1,500 IDs at once (500 IDs per request, 3 requests per second). I initially implemented this with a hard cap of 1,500. Instead, we can use the executeHTTPRequestsInBatches function so that the hook accepts any number of HTTP requests.
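A rough sketch of what this could look like. The signature of executeHTTPRequestsInBatches is assumed here (request thunks, batch size, delay), so adjust to whatever the real helper expects:

```typescript
import axios, { AxiosResponse } from 'axios';

// Assumed shape of the existing helper; the real signature may differ.
declare function executeHTTPRequestsInBatches<T>(
    requests: (() => Promise<T>)[],
    batchSize: number,
    delayMs: number
): Promise<T[]>;

const EUTILS_URL = 'https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi';
const IDS_PER_REQUEST = 500; // PubMed accepts up to ~500 IDs per request
const RATE_LIMIT = 3;        // 3 requests/second without an API key

// Split an arbitrary list of PubMed IDs into chunks of 500
const chunkIds = (ids: string[]): string[][] => {
    const chunks: string[][] = [];
    for (let i = 0; i < ids.length; i += IDS_PER_REQUEST) {
        chunks.push(ids.slice(i, i + IDS_PER_REQUEST));
    }
    return chunks;
};

// Hand every chunk to executeHTTPRequestsInBatches instead of capping
// the input at 1,500 IDs.
const fetchAllPubmedArticles = (ids: string[]): Promise<AxiosResponse[]> => {
    const requests = chunkIds(ids).map(
        (chunk) => () =>
            axios.post(EUTILS_URL, `db=pubmed&retmode=xml&id=${chunk.join(',')}`)
    );
    return executeHTTPRequestsInBatches(requests, RATE_LIMIT, 1000);
};
```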
In the future, we can also add a PubMed API key to raise the rate limit to 10 requests per second if our current volume is not enough. I believe we already add a PubMed API key in the sleuth import, so it would be trivial to do the same here.
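For reference, the E-utilities rate limit is raised by passing an api_key parameter with each request. A minimal sketch (the env variable name is hypothetical; reuse whatever the sleuth import already reads):

```typescript
import axios from 'axios';

const EUTILS_URL = 'https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi';

// Hypothetical env variable name
const PUBMED_API_KEY = process.env.REACT_APP_PUBMED_API_KEY;

const fetchPubmedChunk = (ids: string[]) => {
    const params = new URLSearchParams({
        db: 'pubmed',
        retmode: 'xml',
        id: ids.join(','),
    });
    // With an API key, NCBI allows 10 requests/second instead of 3.
    if (PUBMED_API_KEY) params.append('api_key', PUBMED_API_KEY);
    return axios.post(EUTILS_URL, params.toString());
};
```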
Finally, this hook needs to be turned into a useMutation query to allow for more granularity and control when querying for IDs.
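A sketch of what the useMutation version might look like, assuming react-query v3 and the hypothetical fetchAllPubmedArticles helper from above:

```typescript
import { useMutation } from 'react-query';
import { AxiosResponse } from 'axios';

// Hypothetical hook shape; adjust names to match the real implementation.
const useGetPubmedIds = () => {
    return useMutation<AxiosResponse[], Error, string[]>(
        (pubmedIds: string[]) => fetchAllPubmedArticles(pubmedIds)
    );
};

// Usage in a component: trigger the fetch explicitly rather than on mount.
// const { mutate, data, isLoading, isError } = useGetPubmedIds();
// mutate(idList, { onSuccess: (responses) => { /* parse articles */ } });
```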