Minor optimisation improvements
jamesbayly committed Oct 14, 2024
1 parent b7365d3 commit 888a105
Showing 1 changed file with 4 additions and 4 deletions.
docs/indexer/build/optimisation.md: 4 additions & 4 deletions
````diff
@@ -15,9 +15,9 @@ If you're looking for advice on how to run high performance SubQuery infrastructure
 - Avoid using `blockHandlers` where possible. Using block handlers slows your project down as they can be executed with each and every block. Use them only if you need to, and [consider adjusting project architecture](#review-project-architecture).
 - If you must use a block handler, ensure that you carefully optimise every code path called by it. Because it is executed on each block, the total time it takes will increase linearly as the chain grows.
 - Use the convenient `modulo` filter to run a handler only at a set block interval. This is extremely useful for grouping and calculating data periodically: for instance, if modulo is set to 50, the block handler will run on every 50th block. It gives developers even more control over indexing data (a manifest sketch follows this diff).
-- Always use a [dictionary](../academy/tutorials_examples/dictionary.html#how-does-a-subquery-dictionary-work) (we can help create one for your new network). You can see examples of how to create a dictionary in the [dictionary repository](https://github.com/subquery/subql-dictionary).
-- Use filter conditions in your mapping handlers (within the project manifest) to reduce the number of events/transactions that need to be processed. Create filters as specific as possible to avoid querying unnecessary data.
-- Set the start block in your project manifest to when the contract was initialised or when the first event/transaction occurs.
+- For all major chains we already [provide valid SubQuery dictionaries](https://github.com/subquery/templates/blob/main/dist/dictionary.json), but if you’re indexing a custom chain, you may want to implement your own [dictionary](../academy/tutorials_examples/dictionary.html#how-does-a-subquery-dictionary-work) project to speed up your indexer (we can help create one for your new network). You can see examples of how to create a dictionary in the [dictionary repository](https://github.com/subquery/subql-dictionary).
+- Use the strictest possible filter conditions in your mapping handlers (within the project manifest) to reduce the number of events/transactions that need to be processed. Create filters as specific as possible to avoid querying unnecessary data.
+- Set the start block in your project manifest to when the contract was initialised or, better yet, when the first event/transaction occurs.
 - Use `node worker threads` to move block fetching and block processing into their own worker threads. This can speed up indexing by up to 4 times (depending on the particular project). You can easily enable it using the `--workers=<number>` flag. Note that the number of available CPU cores strictly limits the number of worker threads. [Read more here](../run_publish/references.html#w-workers).
````
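
For illustration, here is a minimal sketch of how these manifest-level suggestions combine in a TypeScript `project.ts` datasource for a Substrate project. The `startBlock` value and the handler names `handleTransfer` and `handleBlockSnapshot` are hypothetical, not taken from the commit:

```ts
import { SubstrateDatasourceKind, SubstrateHandlerKind } from "@subql/types";

// Hypothetical datasource fragment combining the advice above:
// start at the block where activity begins, filter events tightly,
// and run the block handler only on every 50th block via `modulo`.
const dataSource = {
  kind: SubstrateDatasourceKind.Runtime,
  startBlock: 1000000, // assumed: block where the contract was initialised
  mapping: {
    file: "./dist/index.js",
    handlers: [
      {
        handler: "handleTransfer", // hypothetical mapping function
        kind: SubstrateHandlerKind.Event,
        // As specific a filter as possible: only balances.Transfer events
        filter: { module: "balances", method: "Transfer" },
      },
      {
        handler: "handleBlockSnapshot", // hypothetical mapping function
        kind: SubstrateHandlerKind.Block,
        filter: { modulo: 50 }, // executes once every 50 blocks
      },
    ],
  },
};
```

Worker threads are then enabled at run time, e.g. `subql-node --workers=4`.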

## Other Improvements

````diff
@@ -39,9 +39,9 @@ type Transaction @entity {
 ```
 
 - Use parallel/batch processing as often as possible.
-  - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
   - Use `Promise.all()`. In the case of multiple async functions, it is better to execute and resolve them in parallel.
   - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in bulk, with no need to save them one by one (see example below). Read more in our [advanced access to the store documentation](../build/mapping/store.html).
+  - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is faster than querying in a loop.
 
 ```ts
 await Promise.all([
````
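
The diff view cuts the documentation's `Promise.all` example short, so here is a hedged sketch in the same spirit rather than the doc's exact code. It assumes a Substrate SubQuery mapping where `api` and `store` are globals injected by the indexer node; `addressA`, `addressB`, the `handleAccount` helper, and the `transactions` array are hypothetical:

```ts
// Hedged sketch combining the three parallel/batch tips above.
const addressA = "5GrwvaEF5zXb26Fz9rcQpDWS57CtERHpNehXCPcNoHGKutQY"; // hypothetical
const addressB = "5FHneW46xGXgs5mUiveU4sbTyGBzmstUspZC92UhjJM694ty"; // hypothetical

// Batch independent state reads into a single queryMulti call
// instead of awaiting them one by one in a loop.
const [accountA, accountB] = await api.queryMulti([
  [api.query.system.account, addressA],
  [api.query.system.account, addressB],
]);

// Resolve independent async work in parallel with Promise.all.
await Promise.all([
  handleAccount(addressA, accountA), // handleAccount is a hypothetical helper
  handleAccount(addressB, accountB),
]);

// Persist many entities in one bulk call rather than saving each one.
await store.bulkCreate("Transaction", transactions);
```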
