Gathering detailed insights and metrics for undici-types
An HTTP/1.1 client, written from scratch for Node.js
npm install undici-types
Scores:

* Supply Chain: 100
* Quality: 71.6
* Maintenance: 92.6
* Vulnerability: 100
* License: 100
Module System: unable to determine the module system for this package.
GitHub: 6,273 stars, 3,227 commits, 551 forks, 52 watching, 75 branches, 299 contributors. Updated on 28 Nov 2024.
Languages: JavaScript (96.71%), TypeScript (3.28%), Shell (0.01%)
Downloads:

| Period | Downloads | Change vs. previous period |
|---|---|---|
| Last day | 9,529,350 | 2.4% |
| Last week | 52,258,409 | 5.4% |
| Last month | 214,358,439 | -18.8% |
| Last year | 1,761,357,129 | 1,338.5% |
No dependencies detected.
An HTTP/1.1 client, written from scratch for Node.js.
Undici means eleven in Italian. 1.1 -> 11 -> Eleven -> Undici. It is also a Stranger Things reference.
Have a question about using Undici? Open a Q&A Discussion or join our official OpenJS Slack channel.
Looking to contribute? Start by reading the contributing guide
npm i undici
The benchmark is a simple data-fetching example using 50 TCP connections with a pipelining depth of 10, running on Node 22.11.0.
┌────────────────────────┬─────────┬────────────────────┬────────────┬─────────────────────────┐
│ Tests │ Samples │ Result │ Tolerance │ Difference with slowest │
├────────────────────────┼─────────┼────────────────────┼────────────┼─────────────────────────┤
│ 'axios' │ 15 │ '5708.26 req/sec' │ '± 2.91 %' │ '-' │
│ 'http - no keepalive' │ 10 │ '5809.80 req/sec' │ '± 2.30 %' │ '+ 1.78 %' │
│ 'request' │ 30 │ '5828.80 req/sec' │ '± 2.91 %' │ '+ 2.11 %' │
│ 'undici - fetch' │ 40 │ '5903.78 req/sec' │ '± 2.87 %' │ '+ 3.43 %' │
│ 'node-fetch' │ 10 │ '5945.40 req/sec' │ '± 2.13 %' │ '+ 4.15 %' │
│ 'got' │ 35 │ '6511.45 req/sec' │ '± 2.84 %' │ '+ 14.07 %' │
│ 'http - keepalive' │ 65 │ '9193.24 req/sec' │ '± 2.92 %' │ '+ 61.05 %' │
│ 'superagent' │ 35 │ '9339.43 req/sec' │ '± 2.95 %' │ '+ 63.61 %' │
│ 'undici - pipeline' │ 50 │ '13364.62 req/sec' │ '± 2.93 %' │ '+ 134.13 %' │
│ 'undici - stream' │ 95 │ '18245.36 req/sec' │ '± 2.99 %' │ '+ 219.63 %' │
│ 'undici - request' │ 50 │ '18340.17 req/sec' │ '± 2.84 %' │ '+ 221.29 %' │
│ 'undici - dispatch' │ 40 │ '22234.42 req/sec' │ '± 2.94 %' │ '+ 289.51 %' │
└────────────────────────┴─────────┴────────────────────┴────────────┴─────────────────────────┘
```js
import { request } from 'undici'

const {
  statusCode,
  headers,
  trailers,
  body
} = await request('http://localhost:3000/foo')

console.log('response received', statusCode)
console.log('headers', headers)

for await (const data of body) { console.log('data', data) }

console.log('trailers', trailers)
```
The `body` mixins are the most common way to format the request/response body. Mixins include `.arrayBuffer()`, `.blob()`, `.json()`, and `.text()`.

> [!NOTE]
> The body returned from `undici.request` does not implement `.formData()`.
Example usage:
```js
import { request } from 'undici'

const {
  statusCode,
  headers,
  trailers,
  body
} = await request('http://localhost:3000/foo')

console.log('response received', statusCode)
console.log('headers', headers)
console.log('data', await body.json())
console.log('trailers', trailers)
```
Note: Once a mixin has been called, the body cannot be reused; calling additional mixins on `.body` (e.g. `.body.json(); .body.text()`) will result in a `TypeError: unusable` being thrown and returned through the `Promise` rejection.

Should you need to access the body as plain text after using a mixin, the best practice is to use the `.text()` mixin first and then manually parse the text into the desired format.
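For instance, a sketch of that pattern, reusing the placeholder endpoint from the examples above:

```js
import { request } from 'undici'

const { body } = await request('http://localhost:3000/foo')

// Consume the body exactly once, as text...
const text = await body.text()

// ...then parse it manually into the desired format.
let data
try {
  data = JSON.parse(text)
} catch {
  data = text // fall back to the raw text if it is not valid JSON
}
console.log(data)
```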
For more information about their behavior, please reference the body mixin from the Fetch Standard.
This section documents our most commonly used API methods. Additional APIs are documented in their own files within the docs folder and are accessible via the navigation list on the left side of the docs site.
`undici.request([url, options]): Promise`

Arguments:

* url `string | URL | UrlObject`
* options `RequestOptions`
  * dispatcher `Dispatcher` - Default: getGlobalDispatcher
  * method `String` - Default: `PUT` if `options.body`, otherwise `GET`

Returns a promise with the result of the `Dispatcher.request` method.

Calls `options.dispatcher.request(options)`.

See Dispatcher.request for more details, and request examples for examples.
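For instance, a minimal sketch of a request that sends a body (the endpoint and payload are placeholders); the method is given explicitly since it would otherwise default to `PUT` once `options.body` is set:

```js
import { request } from 'undici'

const { statusCode, body } = await request('http://localhost:3000/items', {
  method: 'POST',
  headers: { 'content-type': 'application/json' },
  body: JSON.stringify({ name: 'example' })
})

console.log('status', statusCode)
console.log('created', await body.json())
```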
`undici.stream([url, options, ]factory): Promise`

Arguments:

* url `string | URL | UrlObject`
* options `StreamOptions`
  * dispatcher `Dispatcher` - Default: getGlobalDispatcher
  * method `String` - Default: `PUT` if `options.body`, otherwise `GET`
* factory `Dispatcher.stream.factory`

Returns a promise with the result of the `Dispatcher.stream` method.

Calls `options.dispatcher.stream(options, factory)`.

See Dispatcher.stream for more details.
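A minimal sketch of the factory pattern (the URL and output file are placeholders): the factory receives the status code and headers and returns a Writable, which the response body is piped into.

```js
import { stream } from 'undici'
import { createWriteStream } from 'node:fs'

await stream(
  'http://localhost:3000/foo',
  { method: 'GET' },
  ({ statusCode, headers }) => {
    // Whatever Writable is returned here receives the response body.
    console.log('response received', statusCode)
    return createWriteStream('./response.bin')
  }
)
```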
`undici.pipeline([url, options, ]handler): Duplex`

Arguments:

* url `string | URL | UrlObject`
* options `PipelineOptions`
  * dispatcher `Dispatcher` - Default: getGlobalDispatcher
  * method `String` - Default: `PUT` if `options.body`, otherwise `GET`
* handler `Dispatcher.pipeline.handler`

Returns: `stream.Duplex`

Calls `options.dispatch.pipeline(options, handler)`.

See Dispatcher.pipeline for more details.
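A sketch of how the returned Duplex is typically wired up (the URL and output file are placeholders): the handler receives `{ statusCode, headers, body }` and returns a Readable; returning the body passes the response straight through, so the request body can be piped in and the response piped out.

```js
import { pipeline } from 'undici'
import { Readable } from 'node:stream'
import { createWriteStream } from 'node:fs'
import { pipeline as pump } from 'node:stream/promises'

await pump(
  Readable.from(['hello world']),   // request body flows into the Duplex
  pipeline('http://localhost:3000/echo', { method: 'POST' }, ({ body }) => body),
  createWriteStream('./echo.txt')   // response body flows out of it
)
```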
`undici.connect([url, options]): Promise`

Starts two-way communications with the requested resource using HTTP CONNECT.

Arguments:

* url `string | URL | UrlObject`
* options `ConnectOptions`
  * dispatcher `Dispatcher` - Default: getGlobalDispatcher
* callback `(err: Error | null, data: ConnectData | null) => void` (optional)

Returns a promise with the result of the `Dispatcher.connect` method.

Calls `options.dispatch.connect(options)`.

See Dispatcher.connect for more details.
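A minimal sketch (the URL is a placeholder and the target must accept CONNECT requests): the resolved `ConnectData` exposes the status code, headers, and the raw socket for two-way communication.

```js
import { connect } from 'undici'

const { statusCode, socket } = await connect('http://localhost:3000')
console.log('connected with status', statusCode)

socket.write('ping')
socket.on('data', (chunk) => {
  console.log('received', chunk.toString())
  socket.end()
})
```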
`undici.fetch(input[, init]): Promise`

Implements fetch.

Basic usage example:
```js
import { fetch } from 'undici'

const res = await fetch('https://example.com')
const json = await res.json()
console.log(json)
```
You can pass an optional dispatcher to `fetch` as:
```js
import { fetch, Agent } from 'undici'

const res = await fetch('https://example.com', {
  // Mocks are also supported
  dispatcher: new Agent({
    keepAliveTimeout: 10,
    keepAliveMaxTimeout: 10
  })
})
const json = await res.json()
console.log(json)
```
request.body

A body can be of the types allowed by the Fetch Standard for `BodyInit`: `ArrayBuffer`, `ArrayBufferView` (e.g. a `TypedArray`), `Blob`, `FormData`, `URLSearchParams`, `ReadableStream`, or `string`.

In this implementation of fetch, `request.body` also accepts Async Iterables. This is not present in the Fetch Standard.
```js
import { fetch } from 'undici'

const data = {
  async *[Symbol.asyncIterator]() {
    yield 'hello'
    yield 'world'
  },
}

await fetch('https://example.com', { body: data, method: 'POST', duplex: 'half' })
```
Besides text data and buffers, FormData can also utilize streams via Blob objects:
```js
import { openAsBlob } from 'node:fs'

const file = await openAsBlob('./big.csv')
const body = new FormData()
body.set('file', file, 'big.csv')

await fetch('http://example.com', { method: 'POST', body })
```
request.duplex

* `'half'`

In this implementation of fetch, `request.duplex` must be set if `request.body` is a `ReadableStream` or an Async Iterable. However, even though the value must be set to `'half'`, it is actually a full duplex. For more detail, refer to the Fetch Standard.
response.body

Node.js has two kinds of streams: web streams, which follow the API of the WHATWG web standard found in browsers, and an older Node-specific streams API. `response.body` returns a readable web stream. If you would prefer to work with a Node stream, you can convert a web stream using `Readable.fromWeb()`.
```js
import { fetch } from 'undici'
import { Readable } from 'node:stream'

const response = await fetch('https://example.com')
const readableWebStream = response.body
const readableNodeStream = Readable.fromWeb(readableWebStream)
```
This section documents parts of the Fetch Standard that Undici does not support or does not fully implement.
The Fetch Standard allows users to skip consuming the response body by relying on garbage collection to release connection resources. Undici does not do the same. Therefore, it is important to always either consume or cancel the response body.
Garbage collection in Node is less aggressive and deterministic (due to the lack of clear idle periods that browsers have through the rendering refresh rate) which means that leaving the release of connection resources to the garbage collector can lead to excessive connection usage, reduced performance (due to less connection re-use), and even stalls or deadlocks when running out of connections.
```js
// Do
const headers = await fetch(url)
  .then(async res => {
    for await (const chunk of res.body) {
      // force consumption of body
    }
    return res.headers
  })

// Do not
const headers = await fetch(url)
  .then(res => res.headers)
```
However, if you only want the headers, it might be better to use the HEAD request method, which obviates the need to consume or cancel the response body. See MDN - HTTP - HTTP request methods - HEAD for more details.
```js
const headers = await fetch(url, { method: 'HEAD' })
  .then(res => res.headers)
```
The Fetch Standard requires implementations to exclude certain headers from requests and responses. In browser environments, some headers are forbidden so the user agent remains in full control over them. In Undici, these constraints are removed to give more control to the user.
`undici.upgrade([url, options]): Promise`

Upgrade to a different protocol. See MDN - HTTP - Protocol upgrade mechanism for more details.

Arguments:

* url `string | URL | UrlObject`
* options `UpgradeOptions`
  * dispatcher `Dispatcher` - Default: getGlobalDispatcher
* callback `(error: Error | null, data: UpgradeData) => void` (optional)

Returns a promise with the result of the `Dispatcher.upgrade` method.

Calls `options.dispatcher.upgrade(options)`.

See Dispatcher.upgrade for more details.
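A minimal sketch (the URL and protocol value are placeholders): the resolved `UpgradeData` exposes the response headers and the raw socket once the protocol switch completes.

```js
import { upgrade } from 'undici'

const { headers, socket } = await upgrade('http://localhost:3000', {
  protocol: 'websocket'
})

console.log('upgrade headers', headers)
socket.on('data', (chunk) => console.log('received', chunk.toString()))
```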
`undici.setGlobalDispatcher(dispatcher)`

* dispatcher `Dispatcher`

Sets the global dispatcher used by Common API Methods.
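For example (the agent options are illustrative), an application can install a custom `Agent` once, and every Common API call that does not pass an explicit dispatcher will use it:

```js
import { setGlobalDispatcher, Agent } from 'undici'

// All subsequent undici.request/undici.fetch calls without an explicit
// dispatcher go through this agent.
setGlobalDispatcher(new Agent({ connections: 10 }))
```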
`undici.getGlobalDispatcher()`

Gets the global dispatcher used by Common API Methods.

Returns: `Dispatcher`
`undici.setGlobalOrigin(origin)`

* origin `string | URL | undefined`

Sets the global origin used in `fetch`.

If `undefined` is passed, the global origin will be reset. This will cause `Response.redirect`, `new Request()`, and `fetch` to throw an error when a relative path is passed.
```js
setGlobalOrigin('http://localhost:3000')

const response = await fetch('/api/ping')

console.log(response.url) // http://localhost:3000/api/ping
```
`undici.getGlobalOrigin()`

Gets the global origin used in `fetch`.

Returns: `URL`
`UrlObject`

* port `string | number` (optional)
* path `string` (optional)
* pathname `string` (optional)
* hostname `string` (optional)
* origin `string` (optional)
* protocol `string` (optional)
* search `string` (optional)

This section documents parts of the HTTP/1.1 specification that Undici does not support or does not fully implement.
Undici does not support the `Expect` request header field. The request body is always immediately sent and the `100 Continue` response will be ignored.
Refs: https://tools.ietf.org/html/rfc7231#section-5.1.1
Undici will only use pipelining if configured with a `pipelining` factor greater than `1`. It is also important to pass `blocking: false` in the request options to properly pipeline requests.
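A sketch of that configuration (the origin and paths are placeholders): a `Client` with a pipelining factor greater than 1, with `blocking: false` passed per request.

```js
import { Client } from 'undici'

// Allow up to 10 requests to be pipelined on a single connection.
const client = new Client('http://localhost:3000', { pipelining: 10 })

const results = await Promise.all(
  ['/a', '/b', '/c'].map((path) =>
    client.request({ path, method: 'GET', blocking: false })
  )
)

for (const { statusCode, body } of results) {
  console.log(statusCode, await body.text())
}

await client.close()
```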
Undici always assumes that connections are persistent and will immediately pipeline requests, without checking whether the connection is persistent. Hence, automatic fallback to HTTP/1.0 or HTTP/1.1 without pipelining is not supported.
Undici will immediately pipeline when retrying requests after a failed connection. However, Undici will not retry the first remaining requests in the prior pipeline and instead error the corresponding callback/promise/stream.
Undici will abort all running requests in the pipeline when any of them are aborted.
Since it is not possible to manually follow an HTTP redirect on the server side, Undici returns the actual response instead of an `opaqueredirect` filtered one when invoked with a `manual` redirect. This aligns `fetch()` with the other implementations in Deno and Cloudflare Workers.
Refs: https://fetch.spec.whatwg.org/#atomic-http-redirect-handling
If you experience problems when connecting to a remote server that your DNS servers resolve to an IPv6 (AAAA record) address first, chances are that your local router or ISP has problems connecting to IPv6 networks. In that case undici will throw an error with code `UND_ERR_CONNECT_TIMEOUT`.

If the target server resolves to both IPv6 and IPv4 (A record) addresses and you are using a compatible Node version (18.3.0 and above), you can fix the problem by providing the `autoSelectFamily` option (supported by both `undici.request` and `undici.Agent`), which enables the family autoselection algorithm when establishing the connection.
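A sketch of both ways of providing the option (the URL is a placeholder); the actual address selection is handled by Node's family autoselection algorithm:

```js
import { request, Agent, setGlobalDispatcher } from 'undici'

// Per request: let Node pick between the resolved A and AAAA addresses.
const { statusCode } = await request('https://example.com', {
  autoSelectFamily: true
})
console.log(statusCode)

// Or globally, via an Agent shared by all requests.
setGlobalDispatcher(new Agent({ autoSelectFamily: true }))
```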
MIT
No vulnerabilities found.
Scorecard check results:

* 25 out of 25 merged PRs checked by a CI test -- score normalized to 10
* project has 81 contributing companies or organizations
* no dangerous workflow patterns detected
* update tool detected
* project is fuzzed
* license file detected
* 30 commit(s) and 26 issue activity found in the last 90 days -- score normalized to 10
* packaging workflow detected
* SAST tool is run on all commits
* 0 existing vulnerabilities detected
* security policy file detected
* binaries present in source code
* Found 23/28 approved changesets -- score normalized to 8
* dependency not pinned by hash detected -- score normalized to 3
* no effort to earn an OpenSSF best practices badge detected
* detected GitHub workflow tokens with excessive permissions

Last scanned on 2024-11-27T14:31:14Z
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.