A light-weight module that brings the Fetch API to Node.js
npm install @pnpm/node-fetch
Repository stats: 540 commits, 1 fork, 1 watcher, 10 branches, 37 contributors. Updated on 19 Apr 2023. Languages: JavaScript (97.76%), TypeScript (2.24%).
Downloads (change compared to the previous period):

| Period | Downloads | Change |
|---|---|---|
| Last day | 5,211 | -32.8% |
| Last week | 37,024 | +11.1% |
| Last month | 139,712 | -2.2% |
| Last year | 1,452,874 | +698.4% |
Instead of implementing `XMLHttpRequest` in Node.js to run a browser-specific Fetch polyfill, why not go from the native `http` module to the `fetch` API directly? Hence `node-fetch`: minimal code for a `window.fetch`-compatible API on the Node.js runtime.

See Jason Miller's isomorphic-unfetch or Leonardo Quixada's cross-fetch for isomorphic usage (exports `node-fetch` for server-side, `whatwg-fetch` for client-side).
- Stay consistent with the `window.fetch` API.
- Convert string output (such as `res.text()` and `res.json()`) to UTF-8 automatically.
- If there is something missing that `window.fetch` offers, feel free to open an issue.

Current stable release (`3.x`)
```sh
$ npm install node-fetch
```
```js
// CommonJS
const fetch = require('node-fetch');

// ES Module
import fetch from 'node-fetch';
```
If you want to patch the global object in node:
```js
const fetch = require('node-fetch');

if (!globalThis.fetch) {
  globalThis.fetch = fetch;
}
```
For versions of Node earlier than 12, use this `globalThis` polyfill.
Using an old version of node-fetch? Check out the following files:
NOTE: The documentation below is up to date with `3.x` releases; if you are using an older version, please check how to upgrade.
```js
const fetch = require('node-fetch');

(async () => {
  const response = await fetch('https://github.com/');
  const body = await response.text();

  console.log(body);
})();
```
```js
const fetch = require('node-fetch');

(async () => {
  const response = await fetch('https://api.github.com/users/github');
  const json = await response.json();

  console.log(json);
})();
```
```js
const fetch = require('node-fetch');

(async () => {
  const response = await fetch('https://httpbin.org/post', {method: 'POST', body: 'a=1'});
  const json = await response.json();

  console.log(json);
})();
```
```js
const fetch = require('node-fetch');

(async () => {
  const body = {a: 1};

  const response = await fetch('https://httpbin.org/post', {
    method: 'post',
    body: JSON.stringify(body),
    headers: {'Content-Type': 'application/json'}
  });
  const json = await response.json();

  console.log(json);
})();
```
`URLSearchParams` is available on the global object in Node.js as of v10.0.0. See the official documentation for more usage methods.
NOTE: The `Content-Type` header is only set automatically to `x-www-form-urlencoded` when an instance of `URLSearchParams` is given as such:
```js
const fetch = require('node-fetch');

const params = new URLSearchParams();
params.append('a', 1);

(async () => {
  const response = await fetch('https://httpbin.org/post', {method: 'POST', body: params});
  const json = await response.json();

  console.log(json);
})();
```
NOTE: 3xx-5xx responses are NOT exceptions and should be handled in `then()`; see the next section.
Wrapping the fetch function in a `try/catch` block will catch all exceptions, such as errors originating from Node core libraries (like network errors) and operational errors, which are instances of FetchError. See the error handling document for more details.
```js
const fetch = require('node-fetch');

(async () => {
  try {
    // await is required here, otherwise the rejected promise is not caught
    await fetch('https://domain.invalid/');
  } catch (error) {
    console.log(error);
  }
})();
```
It is common to create a helper function to check that the response contains no client (4xx) or server (5xx) error responses:
```js
const fetch = require('node-fetch');

// MyCustomError is assumed to be a user-defined Error subclass
const checkStatus = res => {
  if (res.ok) {
    // res.status >= 200 && res.status < 300
    return res;
  } else {
    throw new MyCustomError(res.statusText);
  }
};

(async () => {
  const response = await fetch('https://httpbin.org/status/400');
  const data = checkStatus(response); //=> throws MyCustomError for a 4xx/5xx response

  console.log(data);
})();
```
Cookies are not stored by default. However, cookies can be extracted and passed by manipulating request and response headers. See Extract Set-Cookie Header for details.
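A minimal sketch of doing both, assuming the httpbin endpoints and the `session` cookie are purely illustrative (it relies on the `headers.raw()` extraction shown in the next section):

```js
const fetch = require('node-fetch');

(async () => {
  // First request: the server sets a cookie via Set-Cookie.
  // redirect: 'manual' keeps the redirect response so its headers stay accessible.
  const login = await fetch('https://httpbin.org/cookies/set?session=abc123', {redirect: 'manual'});

  // headers.raw() is a node-fetch only API that returns all Set-Cookie values as an array
  const cookies = login.headers.raw()['set-cookie'] || [];

  // Keep only the "name=value" part of each cookie for the Cookie request header
  const cookieHeader = cookies.map(entry => entry.split(';')[0]).join('; ');

  // Follow-up request: pass the cookies back manually
  const response = await fetch('https://httpbin.org/cookies', {
    headers: {cookie: cookieHeader}
  });

  console.log(await response.json());
})();
```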
The "Node.js way" is to use streams when possible. You can pipe `res.body` to another stream. This example uses stream.pipeline to attach stream error handlers and wait for the download to complete.
```js
const fetch = require('node-fetch');
const util = require('util');
const fs = require('fs');
const streamPipeline = util.promisify(require('stream').pipeline);

(async () => {
  const response = await fetch('https://assets-cdn.github.com/images/modules/logos_page/Octocat.png');

  if (response.ok) {
    return streamPipeline(response.body, fs.createWriteStream('./octocat.png'));
  }

  throw new Error(`unexpected response ${response.statusText}`);
})();
```
If you prefer to cache binary data in full, use buffer(). (NOTE: buffer() is a `node-fetch`-only API.)
```js
const fetch = require('node-fetch');
const fileType = require('file-type');

(async () => {
  const response = await fetch('https://octodex.github.com/images/Fintechtocat.png');
  const buffer = await response.buffer();
  const type = await fileType.fromBuffer(buffer); // fromBuffer returns a promise

  console.log(type);
})();
```
```js
const fetch = require('node-fetch');

(async () => {
  const response = await fetch('https://github.com/');

  console.log(response.ok);
  console.log(response.status);
  console.log(response.statusText);
  console.log(response.headers.raw());
  console.log(response.headers.get('content-type'));
})();
```
Unlike browsers, you can access raw `Set-Cookie` headers manually using `Headers.raw()`. This is a `node-fetch`-only API.
```js
const fetch = require('node-fetch');

(async () => {
  const response = await fetch('https://example.com');

  // Returns an array of values, instead of a string of comma-separated values
  console.log(response.headers.raw()['set-cookie']);
})();
```
```js
const {createReadStream} = require('fs');
const fetch = require('node-fetch');

const stream = createReadStream('input.txt');

(async () => {
  const response = await fetch('https://httpbin.org/post', {method: 'POST', body: stream});
  const json = await response.json();

  console.log(json);
})();
```
```js
const fetch = require('node-fetch');
const FormData = require('form-data');

const form = new FormData();
form.append('a', 1);

(async () => {
  const response = await fetch('https://httpbin.org/post', {method: 'POST', body: form});
  const json = await response.json();

  console.log(json);
})();

// OR, using custom headers
// NOTE: getHeaders() is non-standard API

const options = {
  method: 'POST',
  body: form,
  headers: form.getHeaders()
};

(async () => {
  const response = await fetch('https://httpbin.org/post', options);
  const json = await response.json();

  console.log(json);
})();
```
node-fetch also supports spec-compliant FormData implementations such as formdata-node:
```js
const fetch = require('node-fetch');
const FormData = require('formdata-node');

const form = new FormData();
form.set('greeting', 'Hello, world!');

fetch('https://httpbin.org/post', {method: 'POST', body: form})
  .then(res => res.json())
  .then(json => console.log(json));
```
You may cancel requests with `AbortController`. A suggested implementation is abort-controller.

An example of timing out a request after 150ms could be achieved as follows:
```js
const fetch = require('node-fetch');
const AbortController = require('abort-controller');

const controller = new AbortController();
const timeout = setTimeout(() => {
  controller.abort();
}, 150);

(async () => {
  try {
    const response = await fetch('https://example.com', {signal: controller.signal});
    const data = await response.json();

    useData(data); // useData is a placeholder for your own handler
  } catch (error) {
    if (error.name === 'AbortError') {
      console.log('request was aborted');
    }
  } finally {
    clearTimeout(timeout);
  }
})();
```
See test cases for more examples.
- `url`: A string representing the URL for fetching
- `options`: Options for the HTTP(S) request
- Returns: `Promise<Response>`

Perform an HTTP(S) fetch.

`url` should be an absolute URL, such as `https://example.com/`. A path-relative URL (`/file/under/root`) or protocol-relative URL (`//can-be-http-or-https.com/`) will result in a rejected `Promise`.
The default values are shown after each option key.
```js
{
  // These properties are part of the Fetch Standard
  method: 'GET',
  headers: {},        // Request headers. Format is identical to that accepted by the Headers constructor (see below)
  body: null,         // Request body. Can be null, a string, a Buffer, a Blob, or a Node.js Readable stream
  redirect: 'follow', // Set to `manual` to extract redirect headers, `error` to reject redirects
  signal: null,       // Pass an instance of AbortSignal to optionally abort requests

  // The following properties are node-fetch extensions
  follow: 20,           // Maximum redirect count. 0 to not follow redirects
  compress: true,       // Support gzip/deflate content encoding. false to disable
  size: 0,              // Maximum response body size in bytes. 0 to disable
  agent: null,          // http(s).Agent instance or function that returns an instance (see below)
  highWaterMark: 16384, // The maximum number of bytes to store in the internal buffer before ceasing to read from the underlying resource
  insecureHTTPParser: false // Use an insecure HTTP parser that accepts invalid HTTP headers when `true`
}
```
If no values are set, the following request headers will be sent automatically:
| Header | Value |
|---|---|
| `Accept-Encoding` | `gzip,deflate,br` (when `options.compress === true`) |
| `Accept` | `*/*` |
| `Connection` | `close` (when no `options.agent` is present) |
| `Content-Length` | (automatically calculated, if possible) |
| `Transfer-Encoding` | `chunked` (when `req.body` is a stream) |
| `User-Agent` | `node-fetch` |
Note: when `body` is a `Stream`, `Content-Length` is not set automatically.
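If the server requires a `Content-Length` for a stream body, one option is to set the header yourself; a sketch, assuming the body comes from a local file (`./upload.bin` is a placeholder) whose size can be read with `fs.statSync`:

```js
const fs = require('fs');
const fetch = require('node-fetch');

(async () => {
  const path = './upload.bin';      // placeholder file name
  const {size} = fs.statSync(path); // a known size lets us set Content-Length explicitly

  const response = await fetch('https://httpbin.org/post', {
    method: 'POST',
    body: fs.createReadStream(path),
    headers: {'Content-Length': String(size)}
  });

  console.log(response.status);
})();
```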
The `agent` option allows you to specify networking-related options which are outside the scope of Fetch, including but not limited to the following:
See `http.Agent` for more information.
In addition, the `agent` option accepts a function that returns an `http(s).Agent` instance given the current URL. This is useful during a redirection chain across HTTP and HTTPS protocols.
```js
const http = require('http');
const https = require('https');

const httpAgent = new http.Agent({
  keepAlive: true
});
const httpsAgent = new https.Agent({
  keepAlive: true
});

const options = {
  agent: function(_parsedURL) {
    if (_parsedURL.protocol == 'http:') {
      return httpAgent;
    } else {
      return httpsAgent;
    }
  }
};
```
Streams in Node.js have a smaller internal buffer size (16 kB, aka `highWaterMark`) than client-side browsers (>1 MB, not consistent across browsers). Because of that, when you are writing an isomorphic app and using `res.clone()`, it will hang with large responses in Node.

The recommended way to fix this problem is to resolve the cloned response in parallel:
```js
const fetch = require('node-fetch');

(async () => {
  const response = await fetch('https://example.com');
  const r1 = response.clone();

  return Promise.all([response.json(), r1.text()]).then(results => {
    console.log(results[0]);
    console.log(results[1]);
  });
})();
```
If for some reason you don't like the solution above, since `3.x` you are able to modify the `highWaterMark` option:
```js
const fetch = require('node-fetch');

(async () => {
  const response = await fetch('https://example.com', {
    // About 1MB
    highWaterMark: 1024 * 1024
  });

  return response.clone().buffer();
})();
```
Passed through to the `insecureHTTPParser` option on http(s).request. See `http.request` for more information.
An HTTP(S) request containing information about URL, method, headers, and the body. This class implements the Body interface.
Due to the nature of Node.js, the following properties are not implemented at this moment:
type
destination
referrer
referrerPolicy
mode
credentials
cache
integrity
keepalive
The following node-fetch extension properties are provided:
follow
compress
counter
agent
highWaterMark
See options for exact meaning of these extensions.
(spec-compliant)
- `input`: A string representing a URL, or another `Request` (which will be cloned)
- `options`: Options for the HTTP(S) request

Constructs a new `Request` object. The constructor is identical to that in the browser.

In most cases, directly `fetch(url, options)` is simpler than creating a `Request` object.
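For completeness, a small sketch of constructing a `Request` up front and handing it to `fetch` (the URL and the custom header are arbitrary examples):

```js
const fetch = require('node-fetch');
const {Request} = require('node-fetch');

(async () => {
  // fetch() accepts a Request instance as its first argument
  const request = new Request('https://httpbin.org/get', {
    headers: {'X-Example': 'demo'} // arbitrary header, for illustration only
  });

  const response = await fetch(request);

  console.log(request.method, request.url);
  console.log(response.status);
})();
```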
An HTTP(S) response. This class implements the Body interface.
The following properties are not implemented in node-fetch at this moment:
Response.error()
Response.redirect()
type
trailer
(spec-compliant)
- `body`: A `String` or `Readable` stream
- `options`: A `ResponseInit` options dictionary

Constructs a new `Response` object. The constructor is identical to that in the browser.

Because Node.js does not implement service workers (for which this class was designed), one rarely has to construct a `Response` directly.
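One case where building a `Response` by hand can still be useful is stubbing `fetch` in tests; a sketch with a made-up body and status:

```js
const {Response} = require('node-fetch');

(async () => {
  // Construct a Response directly, e.g. to stub out fetch() in a unit test
  const stub = new Response(JSON.stringify({hello: 'world'}), {
    status: 200,
    headers: {'Content-Type': 'application/json'}
  });

  console.log(stub.ok);           //=> true
  console.log(await stub.json()); //=> { hello: 'world' }
})();
```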
(spec-compliant)
Convenience property representing if the request ended normally. Will evaluate to true if the response status was greater than or equal to 200 but smaller than 300.
(spec-compliant)
Convenience property representing if the request has been redirected at least once. Will evaluate to true if the internal redirect counter is greater than 0.
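A quick illustration, using an httpbin redirect endpoint purely as an example:

```js
const fetch = require('node-fetch');

(async () => {
  // /redirect/1 issues a single redirect, which node-fetch follows by default
  const response = await fetch('https://httpbin.org/redirect/1');

  console.log(response.redirected); //=> true
  console.log(response.url);        // final URL after following the redirect
})();
```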
This class allows manipulating and iterating over a set of HTTP headers. All methods specified in the Fetch Standard are implemented.
(spec-compliant)
- `init`: Optional argument to pre-fill the `Headers` object

Construct a new `Headers` object. `init` can be either `null`, a `Headers` object, a key-value map object, or any iterable object.
```js
// Example adapted from https://fetch.spec.whatwg.org/#example-headers-class
const { Headers } = require('node-fetch');

const meta = {
  'Content-Type': 'text/xml',
  'Breaking-Bad': '<3'
};
const headers = new Headers(meta);

// The above is equivalent to
const meta2 = [['Content-Type', 'text/xml'], ['Breaking-Bad', '<3']];
const headers2 = new Headers(meta2);

// You can in fact use any iterable objects, like a Map or even another Headers
const meta3 = new Map();
meta3.set('Content-Type', 'text/xml');
meta3.set('Breaking-Bad', '<3');
const headers3 = new Headers(meta3);
const copyOfHeaders = new Headers(headers3);
```
`Body` is an abstract interface with methods that are applicable to both `Request` and `Response` classes.
The following methods are not yet implemented in node-fetch:
formData()
(deviation from spec)
- Returns: `Readable` stream

Data are encapsulated in the `Body` object. Note that while the Fetch Standard requires the property to always be a WHATWG `ReadableStream`, in node-fetch it is a Node.js `Readable` stream.
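Because the body is a plain Node.js `Readable`, it can be consumed with ordinary stream tooling; a sketch using async iteration (the httpbin streaming endpoint is only an example):

```js
const fetch = require('node-fetch');

(async () => {
  const response = await fetch('https://httpbin.org/stream/3');

  let bytes = 0;

  // response.body is a Node.js Readable, so it supports for await...of
  for await (const chunk of response.body) {
    bytes += chunk.length;
  }

  console.log(`received ${bytes} bytes`);
})();
```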
(spec-compliant)
- Returns: `Boolean`

A boolean property indicating whether this body has been consumed. Per the spec, a consumed body cannot be used again.
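A small sketch of how the flag behaves (the URL is chosen arbitrarily):

```js
const fetch = require('node-fetch');

(async () => {
  const response = await fetch('https://httpbin.org/get');

  console.log(response.bodyUsed); //=> false
  await response.text();
  console.log(response.bodyUsed); //=> true

  try {
    // Reading the same body a second time rejects, per the spec
    await response.json();
  } catch (error) {
    console.log(error.message);
  }
})();
```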
(spec-compliant)
- Returns: `Promise`

Consume the body and return a promise that will resolve to one of these formats.
(node-fetch extension)
- Returns: `Promise<Buffer>`

Consume the body and return a promise that will resolve to a Buffer.
(node-fetch extension)
An operational error in the fetching process. See ERROR-HANDLING.md for more info.
(node-fetch extension)
An Error thrown when the request is aborted in response to an `AbortSignal`'s `abort` event. It has a `name` property of `AbortError`. See ERROR-HANDLING.md for more info.
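A sketch of telling the two error kinds apart in a catch block; the invalid domain simply forces a DNS failure:

```js
const fetch = require('node-fetch');
const {FetchError} = require('node-fetch');

(async () => {
  try {
    await fetch('https://domain.invalid/');
  } catch (error) {
    if (error.name === 'AbortError') {
      console.log('request was aborted');
    } else if (error instanceof FetchError) {
      // Operational errors (DNS failures, invalid JSON, exceeding the redirect limit, ...)
      console.log('fetch failed:', error.message);
    } else {
      throw error;
    }
  }
})();
```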
Since `3.x`, types are bundled with `node-fetch`, so you don't need to install any additional packages.
For older versions please use the type definitions from DefinitelyTyped:
```sh
$ npm install --save-dev @types/node-fetch
```
Thanks to github/fetch for providing a solid implementation reference.
David Frank, Jimmy Wärting, Antoni Kepinski, Richie Bendall, Gregor Martynus
No security vulnerabilities found.