Gathering detailed insights and metrics for @fastify/multipart
```
npm install @fastify/multipart
```
Package score:

| Metric | Score |
|---|---|
| Supply Chain | 98.3 |
| Quality | 99.6 |
| Maintenance | 87.3 |
| Vulnerability | 100 |
| License | 100 |
GitHub stats: 489 stars, 404 commits, 103 forks, 17 watching, 4 branches, 81 contributors. Updated on 22 Nov 2024.
Languages: JavaScript (94.71%), TypeScript (4.91%), HTML (0.38%)
Downloads:

| Period | Downloads | Change vs previous period |
|---|---|---|
| Last day | 50,782 | 17.2% |
| Last week | 239,525 | 12% |
| Last month | 903,527 | 16.8% |
| Last year | 7,398,431 | 100.6% |
Fastify plugin to parse the multipart content-type. Under the hood it uses `@fastify/busboy`.
```
npm i @fastify/multipart
```
```js
const fastify = require('fastify')()
const fs = require('node:fs')
const { pipeline } = require('node:stream/promises')

fastify.register(require('@fastify/multipart'))

fastify.post('/', async function (req, reply) {
  // process a single file
  // also, consider that if you allow to upload multiple files
  // you must consume all files otherwise the promise will never fulfill
  const data = await req.file()

  data.file // stream
  data.fields // other parsed parts
  data.fieldname
  data.filename
  data.encoding
  data.mimetype

  // to accumulate the file in memory! Be careful!
  //
  // await data.toBuffer() // Buffer
  //
  // or

  await pipeline(data.file, fs.createWriteStream(data.filename))

  // be careful of permission issues on disk and not overwrite
  // sensitive files that could cause security risks

  // also, consider that if the file stream is not consumed, the promise will never fulfill

  reply.send()
})

fastify.listen({ port: 3000 }, err => {
  if (err) throw err
  console.log(`server listening on ${fastify.server.address().port}`)
})
```
Note about `data.fields`: busboy consumes the multipart body in serial order (as a stream), so the order of the form fields determines when `@fastify/multipart` can expose them to you. We recommend placing value fields before any file fields; this ensures your fields are accessible before any file is consumed. If you cannot control the order of the fields, be sure to read `data.fields` AFTER consuming the stream, otherwise it will only contain the fields parsed up to that moment.
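For illustration, a minimal client-side sketch (assuming the Node 18+ `fetch`/`FormData`/`Blob` globals; the field names and file content are hypothetical) that places the value field before the file field:

```js
// hypothetical client: send the value field first, the file last,
// so the server can read data.fields before consuming the file
async function send() {
  const form = new FormData()
  form.append('description', 'my upload') // value field first
  form.append('upload', new Blob(['hello world']), 'hello.txt') // file field last
  await fetch('http://localhost:3000/', { method: 'POST', body: form })
}
send()
```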
You can also pass optional arguments to `@fastify/busboy` when registering with Fastify. This is useful for setting limits on the content that can be uploaded. A full list of the available options can be found in the `@fastify/busboy` documentation.
```js
fastify.register(require('@fastify/multipart'), {
  limits: {
    fieldNameSize: 100, // Max field name size in bytes
    fieldSize: 100,     // Max field value size in bytes
    fields: 10,         // Max number of non-file fields
    fileSize: 1000000,  // For multipart forms, the max file size in bytes
    files: 1,           // Max number of file fields
    headerPairs: 2000,  // Max number of header key=>value pairs
    parts: 1000         // For multipart forms, the max number of parts (fields + files)
  }
});
```
For security reasons, `@fastify/multipart` sets the default limits for `parts` and `fileSize` to 1000 and 1048576 (1 MiB) respectively.
Note: if the file stream provided by `data.file` is not consumed (in the examples above it is consumed via `pipeline`), the promise will not be fulfilled at the end of the multipart processing. This behavior is inherited from `@fastify/busboy`.
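A consequence: if you decide not to keep a file, you still have to drain its stream. A minimal sketch (the route and the mimetype check are illustrative assumptions):

```js
fastify.post('/discard', async function (req, reply) {
  const data = await req.file()
  if (data.mimetype !== 'image/png') {
    // discard the bytes; leaving the stream unread would keep the
    // multipart processing (and this request) from ever finishing
    data.file.resume()
  } else {
    await pipeline(data.file, fs.createWriteStream(data.filename))
  }
  reply.send()
})
```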
Note: if you set a `fileSize` limit and you want to know whether the file limit was reached, you can:

- listen to the `data.file.on('limit')` event (sketched after the example below)
- check `data.file.truncated` after consuming the stream
- call `data.file.toBuffer()` and wait for the error to be thrown

```js
const data = await req.file()
await pipeline(data.file, fs.createWriteStream(data.filename))
if (data.file.truncated) {
  // you may need to delete the part of the file that has been saved on disk
  // before the `limits.fileSize` was reached
  reply.send(new fastify.multipartErrors.FilesLimitError());
}

// OR
const data = await req.file()
try {
  const buffer = await data.toBuffer()
} catch (err) {
  // fileSize limit reached!
}
```
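A sketch of the first option, listening for the `'limit'` event that the underlying busboy file stream emits when the `fileSize` limit is reached (the logging is illustrative):

```js
const data = await req.file()
data.file.on('limit', () => {
  // emitted as soon as limits.fileSize is hit; the stream is truncated
  console.warn(`${data.filename} exceeded the fileSize limit`)
})
await pipeline(data.file, fs.createWriteStream(data.filename))
```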
Additionally, you can pass per-request options to the `req.file`, `req.files`, `req.saveRequestFiles` or `req.parts` functions.
```js
fastify.post('/', async function (req, reply) {
  const options = { limits: { fileSize: 1000 } };
  const data = await req.file(options)
  await pipeline(data.file, fs.createWriteStream(data.filename))
  reply.send()
})
```
To handle multiple file uploads, iterate over `req.files()`:

```js
fastify.post('/', async function (req, reply) {
  const parts = req.files()
  for await (const part of parts) {
    await pipeline(part.file, fs.createWriteStream(part.filename))
  }
  reply.send()
})
```
To handle files and fields together, in arrival order, use `req.parts()`:

```js
fastify.post('/upload/raw/any', async function (req, reply) {
  const parts = req.parts()
  for await (const part of parts) {
    if (part.type === 'file') {
      await pipeline(part.file, fs.createWriteStream(part.filename))
    } else {
      // part.type === 'field'
      console.log(part)
    }
  }
  reply.send()
})
```
To accumulate a whole file in memory, for example before uploading it elsewhere:

```js
fastify.post('/upload/raw/any', async function (req, reply) {
  const data = await req.file()
  const buffer = await data.toBuffer()
  // upload to S3
  reply.send()
})
```
`req.saveRequestFiles()` stores all files in the operating system's default directory for temporary files. As soon as the response ends, all files are removed.
```js
fastify.post('/upload/files', async function (req, reply) {
  // stores files to tmp dir and return files
  const files = await req.saveRequestFiles()
  files[0].type // "file"
  files[0].filepath
  files[0].fieldname
  files[0].filename
  files[0].encoding
  files[0].mimetype
  files[0].fields // other parsed parts

  reply.send()
})
```
If you set a `fileSize` limit, a `RequestFileTooLargeError` is thrown when the limit is reached.
```js
fastify.post('/upload/files', async function (req, reply) {
  try {
    const file = await req.file({ limits: { fileSize: 17000 } })
    //const files = req.files({ limits: { fileSize: 17000 } })
    //const parts = req.parts({ limits: { fileSize: 17000 } })
    //const files = await req.saveRequestFiles({ limits: { fileSize: 17000 } })
    reply.send()
  } catch (error) {
    // error instanceof fastify.multipartErrors.RequestFileTooLargeError
  }
})
```
If you want to fall back to the behavior before `4.0.0`, you can disable the throwing behavior by passing `throwFileSizeLimit: false`. Note: this does not affect the behavior of `saveRequestFiles()`.
```js
// globally disable
fastify.register(fastifyMultipart, { throwFileSizeLimit: false })

fastify.post('/upload/file', async function (req, reply) {
  const file = await req.file({ throwFileSizeLimit: false, limits: { fileSize: 17000 } })
  //const files = req.files({ throwFileSizeLimit: false, limits: { fileSize: 17000 } })
  //const parts = req.parts({ throwFileSizeLimit: false, limits: { fileSize: 17000 } })
  //const files = await req.saveRequestFiles({ throwFileSizeLimit: false, limits: { fileSize: 17000 } })
  reply.send()
})
```
The `attachFieldsToBody` option parses all fields automatically and assigns them to `request.body`. By default, files are accumulated in memory (be careful!) as Buffer objects. Uncaught errors are handled by Fastify.
```js
fastify.register(require('@fastify/multipart'), { attachFieldsToBody: true })

fastify.post('/upload/files', async function (req, reply) {
  const uploadValue = await req.body.upload.toBuffer() // access files
  const fooValue = req.body.foo.value // other fields
  const body = Object.fromEntries(
    Object.keys(req.body).map((key) => [key, req.body[key].value])
  ) // Request body in key-value pairs, like req.body in Express (Node 12+)

  // On Node 18+
  const formData = await req.formData()
  console.log(formData)
})
```
Request body key-value pairs can be assigned directly using `attachFieldsToBody: 'keyValues'`. Field values, including file buffers, will be attached to the body object.
```js
fastify.register(require('@fastify/multipart'), { attachFieldsToBody: 'keyValues' })

fastify.post('/upload/files', async function (req, reply) {
  const uploadValue = req.body.upload // access file as buffer
  const fooValue = req.body.foo // other fields
})
```
You can also define an `onFile` handler to avoid accumulating all files in memory.
```js
async function onFile(part) {
  // you have access to the original request via `this`
  console.log(this.id)
  await pipeline(part.file, fs.createWriteStream(part.filename))
}

fastify.register(require('@fastify/multipart'), { attachFieldsToBody: true, onFile })

fastify.post('/upload/files', async function (req, reply) {
  const fooValue = req.body.foo.value // other fields
})
```
The `onFile` handler can also be used with `attachFieldsToBody: 'keyValues'` in order to specify how file buffer values are decoded.
```js
async function onFile(part) {
  const buff = await part.toBuffer()
  const decoded = Buffer.from(buff.toString(), 'base64').toString()
  part.value = decoded // set `part.value` to specify the request body value
}

fastify.register(require('@fastify/multipart'), { attachFieldsToBody: 'keyValues', onFile })

fastify.post('/upload/files', async function (req, reply) {
  const uploadValue = req.body.upload // access file as base64 string
  const fooValue = req.body.foo // other fields
})
```
Note: if you assign all fields to the body and do not define an `onFile` handler, you will not be able to read the files through streams; they have already been read and their contents accumulated in memory. You can only use the `toBuffer` method to read the content. If you try to read from a stream and pipe it to a new file, you will obtain an empty new file.
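To illustrate the note above, a minimal sketch (the `upload` field name is an assumption about the incoming form):

```js
fastify.register(require('@fastify/multipart'), { attachFieldsToBody: true })

fastify.post('/upload/files', async function (req, reply) {
  // works: the contents were already accumulated in memory
  const buf = await req.body.upload.toBuffer()

  // would NOT work: the underlying stream has already been drained,
  // so piping req.body.upload.file would produce an empty file

  reply.send({ bytes: buf.length })
})
```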
When the `attachFieldsToBody` parameter is set to `'keyValues'`, JSON Schema validation on the body behaves similarly to the `application/json` and `application/x-www-form-urlencoded` content types. Additionally, uploaded files will be attached to the body as `Buffer` objects.
```js
fastify.register(require('@fastify/multipart'), { attachFieldsToBody: 'keyValues' })

fastify.post('/upload/files', {
  schema: {
    consumes: ['multipart/form-data'],
    body: {
      type: 'object',
      required: ['myFile'],
      properties: {
        // file that gets decoded to string
        myFile: {
          type: 'object',
        },
        hello: {
          type: 'string',
          enum: ['world']
        }
      }
    }
  }
}, function (req, reply) {
  console.log({ body: req.body })
  reply.send('done')
})
```
If you enable `attachFieldsToBody: true` and set `sharedSchemaId`, a shared JSON Schema is added, which can be used to validate parsed multipart fields.
```js
const opts = {
  attachFieldsToBody: true,
  sharedSchemaId: '#mySharedSchema'
}
fastify.register(require('@fastify/multipart'), opts)

fastify.post('/upload/files', {
  schema: {
    consumes: ['multipart/form-data'],
    body: {
      type: 'object',
      required: ['myField'],
      properties: {
        // field that uses the shared schema
        myField: { $ref: '#mySharedSchema' },
        // or another field that uses the shared schema
        myFiles: { type: 'array', items: fastify.getSchema('mySharedSchema') },
        // or a field that doesn't use the shared schema
        hello: {
          properties: {
            value: {
              type: 'string',
              enum: ['male']
            }
          }
        }
      }
    }
  }
}, function (req, reply) {
  console.log({ body: req.body })
  reply.send('done')
})
```
If provided, the `sharedSchemaId` parameter must be a string ID. A shared schema with that ID is added to your Fastify instance, so you can apply the validation to your service (as in the example above). The added shared schema looks like this:
```js
{
  type: 'object',
  properties: {
    encoding: { type: 'string' },
    filename: { type: 'string' },
    limit: { type: 'boolean' },
    mimetype: { type: 'string' }
  }
}
```
If you want to use `@fastify/multipart` with `@fastify/swagger` and `@fastify/swagger-ui`, you must add a new type called `isFile` and use a custom instance of the validator compiler (Docs).
```js
const fastify = require('fastify')({
  // ...
  ajv: {
    // Adds the file plugin to help @fastify/swagger schema generation
    plugins: [require('@fastify/multipart').ajvFilePlugin]
  }
})

fastify.register(require("@fastify/multipart"), {
  attachFieldsToBody: true,
});

fastify.post(
  "/upload/files",
  {
    schema: {
      consumes: ["multipart/form-data"],
      body: {
        type: "object",
        required: ["myField"],
        properties: {
          myField: { isFile: true },
        },
      },
    },
  },
  function (req, reply) {
    console.log({ body: req.body });
    reply.send("done");
  }
);
```
When sending fields with the body (`attachFieldsToBody` set to `true`), the field might look like this in the `request.body`:
1{ 2 "hello": "world" 3}
This plugin converts such a field to a more complex object. The converted field will look something like this:
```js
{
  hello: {
    fieldname: "hello",
    value: "world",
    fieldnameTruncated: false,
    valueTruncated: false,
    fields: body
  }
}
```
It is important to know that this conversion happens BEFORE the field is validated, so keep that in mind when writing the JSON Schema validation for fields that don't use the shared schema. The validation schema for the field above should look like this:
```js
hello: {
  properties: {
    value: {
      type: 'string'
    }
  }
}
```
If a non-file field has a `Content-Type` header starting with `application/json`, it will be parsed using `JSON.parse`.
The schema to validate JSON fields should look like this:
```js
hello: {
  properties: {
    value: {
      type: 'object',
      properties: {
        /* ... */
      }
    }
  }
}
```
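For illustration, a minimal client-side sketch of sending such a field (assuming the Node 18+ `fetch`/`FormData`/`Blob` globals): giving the part a `Blob` with type `application/json` sets that part's `Content-Type` header, so the plugin runs `JSON.parse` on it:

```js
async function sendJsonField() {
  const form = new FormData()
  // the Blob's type becomes the part's Content-Type header
  form.append('hello', new Blob(
    [JSON.stringify({ child: 'value' })],
    { type: 'application/json' }
  ))
  await fetch('http://localhost:3000/upload/files', { method: 'POST', body: form })
}
sendJsonField()
```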
If you also use the shared JSON schema as shown above, this is a full example which validates the entire field:
```js
const opts = {
  attachFieldsToBody: true,
  sharedSchemaId: '#mySharedSchema'
}
fastify.register(require('@fastify/multipart'), opts)

fastify.post('/upload/files', {
  schema: {
    consumes: ['multipart/form-data'],
    body: {
      type: 'object',
      required: ['field'],
      properties: {
        field: {
          allOf: [
            { $ref: '#mySharedSchema' },
            {
              properties: {
                value: {
                  type: 'object',
                  properties: {
                    child: {
                      type: 'string'
                    }
                  }
                }
              }
            }
          ]
        }
      }
    }
  }
}, function (req, reply) {
  console.log({ body: req.body })
  reply.send('done')
})
```
We export all custom errors via a server decorator: `fastify.multipartErrors`. This is useful if you want to react to specific errors. They are derived from `@fastify/error` and include the correct `statusCode` property.
```js
fastify.post('/upload/files', async function (req, reply) {
  const { FilesLimitError } = fastify.multipartErrors
})
```
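For example, a minimal sketch that reacts to `RequestFileTooLargeError` (the limit value and the response body are illustrative):

```js
fastify.post('/upload/files', async function (req, reply) {
  const { RequestFileTooLargeError } = fastify.multipartErrors
  try {
    const data = await req.file({ limits: { fileSize: 17000 } })
    await data.toBuffer() // throws when the fileSize limit is reached
    reply.send()
  } catch (err) {
    if (err instanceof RequestFileTooLargeError) {
      // custom errors carry the correct statusCode property
      reply.code(err.statusCode).send({ error: 'file too large' })
    } else {
      throw err
    }
  }
})
```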
This project is kindly sponsored by:
Licensed under MIT.
Security advisories: 2

| Severity | Summary | Affected Versions | Patched Versions |
|---|---|---|---|
| 7.5/10 | Denial of service due to unlimited number of parts | >= 7.0.0, < 7.4.1 | 7.4.1 |
| 7.5/10 | Denial of service due to unlimited number of parts | < 6.0.1 | 6.0.1 |
OpenSSF Scorecard (last scanned on 2024-11-25):

- no dangerous workflow patterns detected
- 8 commit(s) and 4 issue activity found in the last 90 days -- score normalized to 10
- no binaries found in the repo
- 0 existing vulnerabilities detected
- license file detected
- security policy file detected
- SAST tool is not run on all commits -- score normalized to 8
- Found 9/19 approved changesets -- score normalized to 4
- detected GitHub workflow tokens with excessive permissions
- no effort to earn an OpenSSF best practices badge detected
- project is not fuzzed
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.