npm install @aeolun/compress
Languages: JavaScript (97.83%), TypeScript (2.17%)
License: NOASSERTION
214 Stars · 383 Commits · 50 Forks · 20 Watchers · 4 Branches · 60 Contributors
Updated on Jul 11, 2025
Latest Version: 8.0.6
Package Id: @aeolun/compress@8.0.6
Unpacked Size: 187.57 kB
Size: 25.12 kB
File Count: 19
NPM Version: 10.9.2
Node Version: 22.16.0
Published on: Jun 19, 2025
Adds compression utils to the Fastify reply object and a hook to decompress request payloads. Supports `gzip`, `deflate`, and `brotli`.
ℹ️ Note: In large-scale scenarios, use a proxy like Nginx to handle response compression.
⚠ Warning: Since `@fastify/compress` version 4.x, payloads compressed with the `zip` algorithm are not automatically uncompressed. This plugin focuses on response compression, and `zip` is not in the IANA Table of Content Encodings.
npm i @fastify/compress
| Plugin version | Fastify version |
| -------------- | --------------- |
| `>=8.x`        | `^5.x`          |
| `>=6.x <8.x`   | `^4.x`          |
| `>=3.x <6.x`   | `^3.x`          |
| `^2.x`         | `^2.x`          |
| `>=0.x <2.x`   | `^1.x`          |
Please note that if a Fastify version is out of support, then so are the corresponding versions of this plugin in the table above. See Fastify's LTS policy for more details.
This plugin adds two functionalities to Fastify: a compress utility and a global compression hook.
Currently, the following encoding tokens are supported, using the first acceptable token in this order:

1. `br`
2. `gzip`
3. `deflate`
4. `*` (no preference; `@fastify/compress` will use `gzip`)
5. `identity` (no compression)

If an unsupported encoding is received or the `accept-encoding` header is missing, the payload will not be compressed. To return an error for unsupported encodings, use the `onUnsupportedEncoding` option.
The plugin compresses payloads based on `content-type`. If absent, it assumes `application/json`.
The plugin supports compressing the following payload types:
The global compression hook is enabled by default. To disable it, pass `{ global: false }`:
```js
await fastify.register(
  import('@fastify/compress'),
  { global: false }
)
```
If only compression or decompression is required, set the `globalCompression` or `globalDecompression` config flags to `false` respectively (both are `true` by default).
```js
await fastify.register(
  import('@fastify/compress'),
  // only decompress compressed incoming requests
  { globalCompression: false }
)
```
Fastify encapsulation can be used to set global compression but run it only in a subset of routes by wrapping them inside a plugin.
ℹ️ Note: If using the `@fastify/compress` plugin together with the `@fastify/static` plugin, `@fastify/compress` must be registered (with its global hook) before registering `@fastify/static`.
Different compression options can be specified per route using the `compress` options in the route's configuration. Setting `compress: false` on any route will disable compression on that route even if global compression is enabled.
```js
await fastify.register(
  import('@fastify/compress'),
  { global: false }
)

// only compress if the payload is above a certain size and use brotli
fastify.get('/custom-route', {
  compress: {
    inflateIfDeflated: true,
    threshold: 128,
    zlib: {
      createBrotliCompress: () => createYourCustomBrotliCompress(),
      createGzip: () => createYourCustomGzip(),
      createDeflate: () => createYourCustomDeflate()
    }
  }
}, (req, reply) => {
  // ...
})
```
reply.compress

This plugin adds a `compress` method to `reply` that compresses a stream or string based on the `accept-encoding` header. If a JS object is passed, it will be stringified to JSON.
ℹ️ Note: When compressing a Response object, the compress middleware only extracts and compresses the body stream. It will handle compression-related headers (like `Content-Encoding` and `Vary`) but does not copy other headers or status from the Response object; these remain the responsibility of your application or Fastify's built-in handling.
The `compress` method uses per-route parameters if configured, otherwise it uses global parameters.
```js
import fs from 'node:fs'
import fastify from 'fastify'

const app = fastify()
await app.register(import('@fastify/compress'), { global: false })

// Compress a file stream
app.get('/file', (req, reply) => {
  reply
    .type('text/plain')
    .compress(fs.createReadStream('./package.json'))
})

// Compress a Response object from fetch
app.get('/fetch', async (req, reply) => {
  const response = await fetch('https://api.example.com/data')
  reply
    .type('application/json')
    .compress(response)
})

// Compress a ReadableStream
app.get('/stream', (req, reply) => {
  const response = new Response('Hello World')
  reply
    .type('text/plain')
    .compress(response.body)
})

await app.listen({ port: 3000 })
```
The `threshold` option sets the minimum byte size for response compression. Defaults to `1024`.
ℹ️ Note: The threshold setting only applies to string and Buffer payloads. Streams (including Node.js streams, Response objects, and ReadableStream objects) are always compressed regardless of the threshold, as their size cannot be determined in advance.
```js
await fastify.register(
  import('@fastify/compress'),
  { threshold: 2048 }
)
```
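The threshold rule described above can be illustrated with a small helper. This is a hypothetical sketch of the decision, not the plugin's internal code: strings and Buffers have a measurable byte length, while anything else (streams, Response objects) does not, so it is always compressed:

```javascript
// Hypothetical helper illustrating the threshold rule (not the plugin's internals)
function shouldCompress (payload, threshold = 1024) {
  if (typeof payload === 'string') {
    return Buffer.byteLength(payload, 'utf8') >= threshold
  }
  if (Buffer.isBuffer(payload)) {
    return payload.length >= threshold
  }
  // streams and stream-like objects: size unknown in advance, always compress
  return true
}
```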
The `mime-db` package determines whether a `content-type` should be compressed. Additional content types can be matched with the `customTypes` option, which accepts a regex or a function.
```js
await fastify.register(
  import('@fastify/compress'),
  { customTypes: /x-protobuf$/ }
)
```
or
```js
await fastify.register(
  import('@fastify/compress'),
  { customTypes: contentType => contentType.endsWith('x-protobuf') }
)
```
Set `onUnsupportedEncoding(encoding, request, reply)` to send a custom error response for unsupported encodings. The function can modify the reply and return a `string | Buffer | Stream | Error` payload.
```js
await fastify.register(
  import('@fastify/compress'),
  {
    onUnsupportedEncoding: (encoding, request, reply) => {
      reply.code(406)
      return 'We do not support the ' + encoding + ' encoding.'
    }
  }
)
```
Response compression can be disabled by an `x-no-compression` header in the request.
The `inflateIfDeflated` option inflates pre-compressed data if the client does not include one of the supported compression types in its `accept-encoding` header.
```js
await fastify.register(
  import('@fastify/compress'),
  { inflateIfDeflated: true }
)

fastify.get('/file', (req, reply) =>
  // will inflate the file on the way out for clients
  // that indicate they do not support compression
  reply.send(fs.createReadStream('./file.gz')))
```
By default, `@fastify/compress` prioritizes compression as described here. Change this by passing an array of compression tokens to the `encodings` option:
```js
await fastify.register(
  import('@fastify/compress'),
  // Only support gzip and deflate, and prefer deflate to gzip
  { encodings: ['deflate', 'gzip'] }
)
```
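To see how such an ordered preference list interacts with a client's `accept-encoding` header, here is a minimal negotiation sketch. The `negotiate` function is a hypothetical illustration, not the plugin's internals; real `accept-encoding` handling also weighs q-values, which this sketch ignores:

```javascript
// Hypothetical sketch: pick the first server-preferred token the client accepts.
// Ignores q-values for brevity; the plugin's real negotiation is more complete.
function negotiate (acceptEncoding, preferences = ['deflate', 'gzip']) {
  const accepted = acceptEncoding
    .split(',')
    .map(token => token.split(';')[0].trim().toLowerCase())
  if (accepted.includes('*')) return preferences[0] // no client preference
  return preferences.find(enc => accepted.includes(enc)) ?? null
}
```

With the `['deflate', 'gzip']` preference above, a client sending `accept-encoding: gzip, br` would get `gzip`, while a client offering only `br` would match nothing.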
Compression can be tuned with `brotliOptions` and `zlibOptions`, which are passed directly to native Node.js `zlib` methods. See the class definitions.
```js
server.register(fastifyCompress, {
  brotliOptions: {
    params: {
      [zlib.constants.BROTLI_PARAM_MODE]: zlib.constants.BROTLI_MODE_TEXT, // useful for APIs that primarily return text
      [zlib.constants.BROTLI_PARAM_QUALITY]: 4 // default is 4, max is 11, min is 0
    }
  },
  zlibOptions: {
    level: 6 // default is typically 6, max is 9, min is 0
  }
})
```
`Content-Length` header removal with `removeContentLengthHeader`

By default, `@fastify/compress` removes the reply `Content-Length` header. Change this by setting `removeContentLengthHeader` to `false`, either globally or per route.
```js
// Global plugin scope
await server.register(fastifyCompress, { global: true, removeContentLengthHeader: false })

// Route-specific scope
fastify.get('/file', {
  compress: { removeContentLengthHeader: false }
}, (req, reply) =>
  reply.compress(fs.createReadStream('./file.gz'))
)
```
This plugin adds a `preParsing` hook to decompress the request payload based on the `content-encoding` request header.
Currently, the following encoding tokens are supported:

- `br`
- `gzip`
- `deflate`

If an unsupported encoding or invalid payload is received, the plugin throws an error. If the request header is missing, the plugin yields to the next hook.
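From the client's side, sending a payload to a route protected by this hook means compressing the body and setting the matching `content-encoding` header. A sketch using only Node's built-in `zlib` (the example payload and headers are illustrative):

```javascript
import zlib from 'node:zlib'

// Build a gzip-compressed JSON body, as a client would send it.
const payload = JSON.stringify({ hello: 'world' })
const compressed = zlib.gzipSync(payload)

// Headers the client must set so the preParsing hook picks gzip:
const headers = {
  'content-type': 'application/json',
  'content-encoding': 'gzip'
}

// The server-side hook effectively performs the inverse before body parsing:
const inflated = zlib.gunzipSync(compressed).toString()
```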
The global request decompression hook is enabled by default. To disable it, pass `{ global: false }`:
```js
await fastify.register(
  import('@fastify/compress'),
  { global: false }
)
```
Fastify encapsulation can be used to set global decompression but run it only in a subset of routes by wrapping them inside a plugin.
Specify different decompression options per route using the `decompress` options in the route's configuration.
```js
await fastify.register(
  import('@fastify/compress'),
  { global: false }
)

// Always decompress using gzip
fastify.get('/custom-route', {
  decompress: {
    forceRequestEncoding: 'gzip',
    zlib: {
      createBrotliDecompress: () => createYourCustomBrotliDecompress(),
      createGunzip: () => createYourCustomGunzip(),
      createInflate: () => createYourCustomInflate()
    }
  }
}, (req, reply) => {
  // ...
})
```
By default, `@fastify/compress` accepts all encodings specified here. Change this by passing an array of compression tokens to the `requestEncodings` option:
```js
await fastify.register(
  import('@fastify/compress'),
  // Only support gzip
  { requestEncodings: ['gzip'] }
)
```
By default, `@fastify/compress` chooses the decompression algorithm based on the `content-encoding` header. One algorithm can be forced, and the header ignored, by providing the `forceRequestEncoding` option.
If the request payload is not actually compressed, `@fastify/compress` will still try to decompress it, resulting in an error.
The response error can be customized for unsupported request payload encodings by setting `onUnsupportedRequestEncoding(request, encoding)` to a function that returns an error.
```js
await fastify.register(
  import('@fastify/compress'),
  {
    onUnsupportedRequestEncoding: (request, encoding) => {
      return {
        statusCode: 415,
        code: 'UNSUPPORTED',
        error: 'Unsupported Media Type',
        message: 'We do not support the ' + encoding + ' encoding.'
      }
    }
  }
)
```
The response error can be customized for undetectable request payloads by setting `onInvalidRequestPayload(request, encoding, error)` to a function that returns an error.
```js
await fastify.register(
  import('@fastify/compress'),
  {
    onInvalidRequestPayload: (request, encoding, error) => {
      return {
        statusCode: 400,
        code: 'BAD_REQUEST',
        error: 'Bad Request',
        message: 'This is not a valid ' + encoding + ' encoded payload: ' + error.message
      }
    }
  }
)
```
When `@fastify/compress` receives a payload type that it doesn't natively support for compression (excluding the types listed in Supported payload types), the behavior depends on the compression method:
Using `reply.compress()`: The plugin will attempt to serialize the payload using Fastify's `serialize` function and then compress the result. This provides a best-effort approach to handle custom objects.
Using global compression hook: To prevent breaking applications, the plugin will pass through unsupported payload types without compression. This fail-safe approach ensures that servers continue to function even when encountering unexpected payload types.
Licensed under MIT.
No vulnerabilities found.

OpenSSF Scorecard (last scanned on 2025-07-07):

- No dangerous workflow patterns detected
- No binaries found in the repo
- GitHub workflow tokens follow principle of least privilege
- 0 existing vulnerabilities detected
- Security policy file detected
- License file detected
- SAST tool is not run on all commits (score normalized to 9)
- 9 commit(s) and 0 issue activity found in the last 90 days (score normalized to 7)
- Found 3/26 approved changesets (score normalized to 1)
- Dependency not pinned by hash detected (score normalized to 0)
- No effort to earn an OpenSSF best practices badge detected
- Project is not fuzzed

The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.