Gathering detailed insights and metrics for @octokit/plugin-throttling
Octokit plugin for GitHub’s recommended request throttling
npm install @octokit/plugin-throttling
- 111 Stars
- 816 Commits
- 35 Forks
- 11 Watching
- 6 Branches
- 30 Contributors
- Updated on 26 Nov 2024
- Languages: TypeScript (92.4%), JavaScript (7.6%)
Cumulative downloads

| Period | Downloads | Change vs. previous period |
|---|---|---|
| Last day | 362,070 | -5.1% |
| Last week | 2,169,116 | +4.1% |
| Last month | 9,062,108 | +7.4% |
| Last year | 85,800,702 | +122.2% |
Octokit plugin for GitHub’s recommended request throttling
Implements all recommended best practices to prevent hitting secondary rate limits.
| Environment | Setup |
|---|---|
| Browsers | Load `@octokit/plugin-throttling` directly in the browser |
| Node | Install with `npm install @octokit/plugin-throttling`; imports are shown in the example below |
> [!IMPORTANT]
> As we use conditional exports, you will need to adapt your `tsconfig.json` by setting `"moduleResolution": "node16", "module": "node16"`.
>
> See the TypeScript docs on package.json "exports", and this helpful guide on transitioning to ESM from @sindresorhus.
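For reference, the corresponding `tsconfig.json` fragment contains just the two settings mentioned above; everything else is up to your project:

```json
{
  "compilerOptions": {
    "module": "node16",
    "moduleResolution": "node16"
  }
}
```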
The code below creates a "Hello, world!" issue on every repository in a given organization. Without the throttling plugin it would send many requests in parallel and hit rate limits very quickly. The @octokit/plugin-throttling plugin slows your requests down according to the official guidelines, so you don't get blocked before your quota is exhausted.
The `throttle.onSecondaryRateLimit` and `throttle.onRateLimit` options are required. Return `true` to automatically retry the request after `retryAfter` seconds.
```js
import { Octokit } from "@octokit/core";
import { throttling } from "@octokit/plugin-throttling";

const MyOctokit = Octokit.plugin(throttling);

const octokit = new MyOctokit({
  auth: `secret123`,
  throttle: {
    onRateLimit: (retryAfter, options, octokit, retryCount) => {
      octokit.log.warn(
        `Request quota exhausted for request ${options.method} ${options.url}`,
      );

      if (retryCount < 1) {
        // only retries once
        octokit.log.info(`Retrying after ${retryAfter} seconds!`);
        return true;
      }
    },
    onSecondaryRateLimit: (retryAfter, options, octokit) => {
      // does not retry, only logs a warning
      octokit.log.warn(
        `SecondaryRateLimit detected for request ${options.method} ${options.url}`,
      );
    },
  },
});

async function createIssueOnAllRepos(org) {
  const repos = await octokit.paginate(
    octokit.repos.listForOrg.endpoint({ org }),
  );
  return Promise.all(
    repos.map(({ name }) =>
      octokit.issues.create({
        owner: org,
        repo: name,
        title: "Hello, world!",
      }),
    ),
  );
}
```
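For illustration, a hypothetical invocation of the helper above could look like this (the organization name is a placeholder):

```js
// hypothetical usage: "my-org" is a placeholder organization name
const issues = await createIssueOnAllRepos("my-org");
octokit.log.info(`Created ${issues.length} issues`);
```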
Pass `{ throttle: { enabled: false } }` to disable this plugin.
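For example, a minimal sketch that constructs an instance with throttling turned off, reusing `MyOctokit` from the example above:

```js
// throttling disabled entirely; requests are sent without the plugin's pacing
const unthrottledOctokit = new MyOctokit({
  auth: "secret123",
  throttle: { enabled: false },
});
```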
Enabling clustering support ensures that your application will not exceed rate limits across Octokit instances and across Node.js processes.
First install either `redis` or `ioredis`:
```sh
# NodeRedis (https://github.com/NodeRedis/node_redis)
npm install --save redis

# or ioredis (https://github.com/luin/ioredis)
npm install --save ioredis
```
Then in your application:
```js
import Bottleneck from "bottleneck";
import Redis from "redis";

const client = Redis.createClient({
  /* options */
});
const connection = new Bottleneck.RedisConnection({ client });
connection.on("error", (err) => console.error(err));

const octokit = new MyOctokit({
  auth: "secret123",
  throttle: {
    onSecondaryRateLimit: (retryAfter, options, octokit) => {
      /* ... */
    },
    onRateLimit: (retryAfter, options, octokit) => {
      /* ... */
    },

    // The Bottleneck connection object
    connection,

    // A "throttling ID". All octokit instances with the same ID
    // using the same Redis server will share the throttling.
    id: "my-super-app",

    // Otherwise the plugin uses a lighter version of Bottleneck without Redis support
    Bottleneck,
  },
});

// To close the connection and allow your application to exit cleanly:
await connection.disconnect();
```
To use the `ioredis` library instead:

```js
import Redis from "ioredis";

const client = new Redis({
  /* options */
});
const connection = new Bottleneck.IORedisConnection({ client });
connection.on("error", (err) => console.error(err));
```
| name | type | description |
|---|---|---|
| `options.retryAfterBaseValue` | `Number` | Number of milliseconds that will be used to multiply the time to wait based on `retry-after` or `x-ratelimit-reset` headers. Defaults to `1000`. |
| `options.fallbackSecondaryRateRetryAfter` | `Number` | Number of seconds to wait until retrying a request when a secondary rate limit is hit and no `retry-after` header was present in the response. Defaults to `60`. |
| `options.connection` | `Bottleneck.RedisConnection` | A Bottleneck connection instance. See Clustering above. |
| `options.id` | `string` | A "throttling ID". All octokit instances with the same ID using the same Redis server will share the throttling. See Clustering above. Defaults to `no-id`. |
| `options.Bottleneck` | `Bottleneck` | Bottleneck constructor. See Clustering above. Defaults to `bottleneck/light`. |
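As a minimal sketch (the values are arbitrary examples, not recommendations), these options are passed alongside the required handlers:

```js
const octokit = new MyOctokit({
  auth: "secret123",
  throttle: {
    onRateLimit: (retryAfter, options, octokit) => true, // always retry after retryAfter seconds
    onSecondaryRateLimit: (retryAfter, options, octokit) => false, // never retry secondary rate limits
    retryAfterBaseValue: 2000, // double the wait derived from retry-after / x-ratelimit-reset
    fallbackSecondaryRateRetryAfter: 120, // wait 120s if no retry-after header is present
  },
});
```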
No vulnerabilities found.
Scorecard checks:

- no binaries found in the repo
- no dangerous workflow patterns detected
- all changesets reviewed
- 19 commit(s) and 1 issue activity found in the last 90 days -- score normalized to 10
- license file detected
- packaging workflow detected
- 0 existing vulnerabilities detected
- SAST tool is run on all commits
- security policy file detected
- dependency not pinned by hash detected -- score normalized to 5
- detected GitHub workflow tokens with excessive permissions
- no effort to earn an OpenSSF best practices badge detected
- project is not fuzzed

Last Scanned on 2024-11-18
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.