# superagent-throttle

A plugin for superagent that throttles requests. Useful for rate- or concurrency-limited APIs.

## Features
- This doesn't just delay requests by an arbitrary number of ms, but intelligently manages requests so they're sent as soon as possible whilst staying beneath rate limits.
- Can create serialised subqueues on the fly.
- Follows the superagent `.use(throttle.plugin())` architecture.
- Can use multiple instances.
- Includes builds for node 4 LTS & superagent-supported browsers.
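The "intelligently manages requests" claim above comes down to scheduling: queued requests are dispatched as soon as doing so stays under both the rolling rate window (`rate` per `ratePer` ms) and the concurrency cap. A minimal plain-JavaScript sketch of that idea (a made-up `MiniThrottle` class for illustration only, not the library's actual internals):

```javascript
// Sketch of rate + concurrency limited dispatch.
// Hypothetical `MiniThrottle`; NOT superagent-throttle's real implementation.
class MiniThrottle {
  constructor ({ rate = 40, ratePer = 40000, concurrent = 20 } = {}) {
    this.rate = rate             // max sends per rolling `ratePer` window
    this.ratePer = ratePer       // window length in ms
    this.concurrent = concurrent // max in-flight requests
    this.sent = []               // timestamps of recent sends
    this.inFlight = 0
    this.queue = []              // tasks waiting to run
  }

  // task: a function taking a `done` callback, called when its response lands
  add (task) {
    this.queue.push(task)
    this.cycle()
  }

  cycle () {
    const now = Date.now()
    // forget sends that have left the rolling window
    this.sent = this.sent.filter((t) => now - t < this.ratePer)
    while (
      this.queue.length &&
      this.sent.length < this.rate &&
      this.inFlight < this.concurrent
    ) {
      const task = this.queue.shift()
      this.sent.push(Date.now())
      this.inFlight += 1
      task(() => {
        this.inFlight -= 1
        this.cycle() // a finished request may free a concurrency slot
      })
    }
    // if rate-limited, try again when the oldest send leaves the window
    if (this.queue.length && this.sent.length >= this.rate) {
      setTimeout(() => this.cycle(), this.sent[0] + this.ratePer - now)
    }
  }
}
```

The key point is that nothing waits a fixed delay: a request goes out the moment a rate or concurrency slot opens.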
## Install

```
npm i --save superagent-throttle
```
## Usage

```javascript
const request = require('superagent')
const Throttle = require('superagent-throttle')

let throttle = new Throttle({
  active: true,     // set false to pause queue
  rate: 5,          // how many requests can be sent every `ratePer`
  ratePer: 10000,   // number of ms in which `rate` requests may be sent
  concurrent: 2     // how many requests can be sent concurrently
})

request
  .get('http://placekitten.com/100/100')
  .use(throttle.plugin())
  .end((err, res) => { ... })
```
## Events

```javascript
const request = require('superagent')
const Throttle = require('superagent-throttle')

let throttle = new Throttle()
  .on('sent', (request) => { ... })     // sent a request
  .on('received', (request) => { ... }) // received a response
  .on('drained', () => { ... })         // received last response
```
## Builds

```javascript
// node 6
import Throttle from 'superagent-throttle'

// node 4
var Throttle = require('superagent-throttle/dist/node4')

// all browsers supported by superagent
var Throttle = require('superagent-throttle/dist/browser')
```
## Serialised Sub Queues

When using APIs to update a client, you may want some serialised requests which
still count towards your rate limit but do not block other requests. You can do
that by passing a uri (not necessarily a valid url) to `throttle.plugin` for
those requests you want to serialise, and leaving it out for other async
requests. This can be done on the fly; you don't need to initialise subqueues
first.
```javascript
let endpoint = 'http://example.com/endpoint'

request
  .get(endpoint)
  .set('someData', someData) // .set takes (field, value), not a colon pair
  .use(throttle.plugin(endpoint))
  .end(callback)
```
It's common to use an endpoint for the uri, simply to serialise requests to that endpoint without interfering with requests to other endpoints.
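One way to picture the serialisation is as a FIFO chain per uri: each request joins the chain for its key and runs only after the previous request on that chain completes, while requests on other keys proceed independently. A rough plain-JavaScript sketch of that per-key chaining idea (the `serialiser` helper is hypothetical, not part of superagent-throttle):

```javascript
// Sketch: serialise async tasks per key, created on the fly.
// `serialiser` is a hypothetical illustration, not the library's code.
function serialiser () {
  const chains = new Map() // key -> promise tail of that subqueue

  // task: () => Promise; tasks sharing a key run one at a time
  return function run (key, task) {
    const tail = chains.get(key) || Promise.resolve()
    const next = tail.then(task, task)    // run even if the previous task failed
    chains.set(key, next.catch(() => {})) // keep the chain alive on errors
    return next
  }
}
```

New keys need no setup: the first request for a key starts its chain, matching the "on the fly" behaviour described above.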
## Options

- `active`: whether or not the queue is paused (default: `true`)
- `rate`: how many requests can be sent every `ratePer` (default: `40`)
- `ratePer`: number of ms in which `rate` requests may be sent (default: `40000`)
- `concurrent`: how many requests can be sent concurrently (default: `20`)
Options can be set after instantiation using the `options` method.

```javascript
const Throttle = require('superagent-throttle')

let throttle = new Throttle({ active: false }) // start paused
throttle.options('active', true)               // unpause
```
See the fancy annotated code.
## Changelog

- ES6 imports
- Included compatibility builds
- Switched to nock for test stubbing
- Fixed bug where errored requests were not cleared from the concurrency count (possibly related to issue #6)
- Removed extraneous dependencies
- Fancy ES6 class definition
- Added unit tests
- Event emitter
- Breaks 0.1.0 syntax
## Author

Levi Wheatcroft [email protected]

## Contributing

Contributions welcome; please submit all pull requests against the master branch.