Future Data Reentrancy

Andrey Kurilov edited this page Mar 14, 2017 · 1 revision

New data items

The tool should be able to create large objects (1 GiB, 10 GiB and more) with a high level of concurrency (up to 1 million). This means that the tool should stream the produced data directly, avoiding excessive memory allocation. On the other hand, producing and sending only one byte at a time would make the execution rate too low because of frequent method calls and context switching. Therefore, Mongoose pre-produces a fixed amount of custom data in memory. The xorshift algorithm is used for the data pre-production, as it has been found to be the fastest. This data buffer acts as a circular (ring) buffer. Every data item is defined by its offset in the data ring, which equals the lower 32 bits of the unique 64-bit data item identifier:

`ringOffset_k = k mod size_ring`
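The scheme above can be sketched as follows. This is a minimal illustration, not Mongoose's actual implementation: the class name `DataRing`, the ring size, and the method names are assumptions for the example. The ring content is pre-produced once with a 64-bit xorshift step, and each item's data is then read from the ring starting at the offset derived from its identifier, wrapping around at the end:

```java
public class DataRing {

    private final byte[] ring;

    // Pre-produce the ring content using Marsaglia's 64-bit xorshift PRNG.
    public DataRing(final int size, final long seed) {
        ring = new byte[size];
        long x = seed == 0 ? 1 : seed; // xorshift state must never be zero
        for (int i = 0; i < size; i += Long.BYTES) {
            x ^= x << 13;
            x ^= x >>> 7;
            x ^= x << 17;
            // unpack the 64-bit word into up to 8 ring bytes
            for (int j = 0; j < Long.BYTES && i + j < size; j++) {
                ring[i + j] = (byte) (x >>> (j * 8));
            }
        }
    }

    // The item's starting offset in the ring: identifier modulo ring size.
    public int offset(final long itemId) {
        return (int) Long.remainderUnsigned(itemId, ring.length);
    }

    // Read `len` bytes of the item's data, wrapping around the ring,
    // so arbitrarily large items need no extra memory beyond the ring.
    public byte[] read(final long itemId, final int len) {
        final byte[] out = new byte[len];
        final int off = offset(itemId);
        for (int i = 0; i < len; i++) {
            out[i] = ring[(off + i) % ring.length];
        }
        return out;
    }
}
```

Because the ring content is fixed and the offset is a pure function of the item identifier, any item's data can be regenerated on demand, which is what makes the data reentrant.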