A generic rate limiter for node.js. Useful for API clients, web crawling, or other tasks that need to be throttled
npm install limiter
1,511 Stars · 87 Commits · 135 Forks · 15 Watching · 1 Branch · 15 Contributors
Updated on 27 Nov 2024
Languages: TypeScript (95.73%), JavaScript (4.27%)
Downloads

| Period     | Downloads   | Change vs. previous period |
| ---------- | ----------- | -------------------------- |
| Last day   | 1,235,405   | -2.5%                      |
| Last week  | 6,474,837   | +2.4%                      |
| Last month | 26,899,052  | +13.9%                     |
| Last year  | 267,733,979 | +16.4%                     |
Provides a generic rate limiter for the web and node.js. Useful for API clients, web crawling, or other tasks that need to be throttled. Two classes are exposed, RateLimiter and TokenBucket. TokenBucket provides a lower level interface to rate limiting with a configurable burst rate and drip rate. RateLimiter sits on top of the token bucket and adds a restriction on the maximum number of tokens that can be removed each interval to comply with common API restrictions such as "150 requests per hour maximum".
yarn add limiter
A simple example allowing 150 requests per hour:
```javascript
import { RateLimiter } from "limiter";

// Allow 150 requests per hour (the Twitter search limit). Also understands
// 'second', 'minute', 'day', or a number of milliseconds
const limiter = new RateLimiter({ tokensPerInterval: 150, interval: "hour" });

async function sendRequest() {
  // This call will throw if we request more than the maximum number of requests
  // that were set in the constructor
  // remainingRequests tells us how many additional requests could be sent
  // right this moment
  const remainingRequests = await limiter.removeTokens(1);
  callMyRequestSendingFunction(...);
}
```
Another example allowing one message to be sent every 250ms:
```javascript
import { RateLimiter } from "limiter";

const limiter = new RateLimiter({ tokensPerInterval: 1, interval: 250 });

async function sendMessage() {
  const remainingMessages = await limiter.removeTokens(1);
  callMyMessageSendingFunction(...);
}
```
The default behaviour is to wait for the duration of the rate limiting currently in effect before the promise is resolved, but if you pass `fireImmediately: true`, the promise will be resolved immediately with `remainingRequests` set to -1:
```javascript
import { RateLimiter } from "limiter";

const limiter = new RateLimiter({
  tokensPerInterval: 150,
  interval: "hour",
  fireImmediately: true
});

async function requestHandler(request, response) {
  // Immediately send 429 header to client when rate limiting is in effect
  const remainingRequests = await limiter.removeTokens(1);
  if (remainingRequests < 0) {
    response.writeHead(429, { 'Content-Type': 'text/plain;charset=UTF-8' });
    response.end('429 Too Many Requests - your IP is being rate limited');
  } else {
    callMyMessageSendingFunction(...);
  }
}
```
A synchronous method, tryRemoveTokens(), is available on both RateLimiter and TokenBucket. It returns immediately with a boolean indicating whether the token removal was successful.
```javascript
import { RateLimiter } from "limiter";

const limiter = new RateLimiter({ tokensPerInterval: 10, interval: "second" });

if (limiter.tryRemoveTokens(5))
  console.log('Tokens removed');
else
  console.log('No tokens removed');
```
To get the number of remaining tokens outside the `removeTokens` promise, simply use the `getTokensRemaining` method.
```javascript
import { RateLimiter } from "limiter";

const limiter = new RateLimiter({ tokensPerInterval: 1, interval: 250 });

// Prints 1 since we did not remove a token and our number of tokens per
// interval is 1
console.log(limiter.getTokensRemaining());
```
Using the token bucket directly to throttle at the byte level:
```javascript
import { TokenBucket } from "limiter";

const BURST_RATE = 1024 * 1024 * 150; // 150 MB/sec burst rate
const FILL_RATE = 1024 * 1024 * 50;   // 50 MB/sec sustained rate

// We could also pass a parent token bucket in to create a hierarchical token
// bucket (see the sketch below)
const bucket = new TokenBucket({
  bucketSize: BURST_RATE,
  tokensPerInterval: FILL_RATE,
  interval: "second"
});

async function handleData(myData) {
  await bucket.removeTokens(myData.byteLength);
  sendMyData(myData);
}
```
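The parent bucket mentioned in the comment above lets several buckets draw from a shared capacity. A minimal sketch of a hierarchical setup, assuming the constructor accepts a parentBucket option (the option name, sendChunk, and sendMyData are illustrative; verify against the version you have installed):

```javascript
import { TokenBucket } from "limiter";

// Global cap shared by every connection: roughly 1 MB/sec
const globalBucket = new TokenBucket({
  bucketSize: 1024 * 1024,
  tokensPerInterval: 1024 * 1024,
  interval: "second"
});

// Per-connection cap of roughly 100 KB/sec, drawing from the global bucket.
// The parentBucket option name is an assumption; check your installed version.
const connectionBucket = new TokenBucket({
  bucketSize: 1024 * 100,
  tokensPerInterval: 1024 * 100,
  interval: "second",
  parentBucket: globalBucket
});

async function sendChunk(chunk) {
  // Waits until both the per-connection and the global bucket have capacity
  await connectionBucket.removeTokens(chunk.byteLength);
  sendMyData(chunk); // placeholder, as in the example above
}
```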
Both the token bucket and rate limiter should be used with a message queue, or some other way of preventing multiple simultaneous calls to removeTokens(). Otherwise, earlier messages may get held up for long periods of time if more recent messages are continually draining the token bucket. This can lead to out-of-order messages or the appearance of "lost" messages under heavy load.
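One simple way to serialize calls, sketched below, is to chain every send onto the tail of a single promise so that only one removeTokens() call is pending at a time (the sendInOrder helper and callMyMessageSendingFunction placeholder are illustrative, not part of the library):

```javascript
import { RateLimiter } from "limiter";

const limiter = new RateLimiter({ tokensPerInterval: 1, interval: 250 });

// Tail of the queue: each send waits for the previous one to finish, so
// removeTokens() is never called concurrently and ordering is preserved.
let queue = Promise.resolve();

function sendInOrder(message) {
  const result = queue.then(async () => {
    await limiter.removeTokens(1);
    callMyMessageSendingFunction(message); // placeholder, as in the examples above
  });
  // Keep the chain alive even if one send fails
  queue = result.catch(() => {});
  return result;
}
```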
MIT License
No vulnerabilities found.
OpenSSF Scorecard checks:
- no binaries found in the repo
- license file detected
- 3 existing vulnerabilities detected
- Found 7/15 approved changesets -- score normalized to 4
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- no effort to earn an OpenSSF best practices badge detected
- project is not fuzzed
- security policy file not detected
- branch protection not enabled on development/release branches
- SAST tool is not run on all commits -- score normalized to 0
Last Scanned on 2024-11-18
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.