@msgpack/msgpack - MessagePack for JavaScript / msgpack.org[JavaScript/TypeScript/ECMA-262]
npm install @msgpack/msgpack
This library is an implementation of MessagePack for TypeScript and JavaScript, providing a compact and efficient binary serialization format. Learn more about MessagePack at https://msgpack.org/.
This library serves as a comprehensive reference implementation of MessagePack for JavaScript with a focus on accuracy, compatibility, interoperability, and performance.
Additionally, this is a universal JavaScript library. It is compatible not only with browsers, but also with Node.js and other JavaScript engines that implement the ES2015+ standards. As it is written in TypeScript, this library bundles up-to-date type definition files (`.d.ts`).

Note that this is the second edition of "MessagePack for JavaScript". The first edition, which was implemented in ES5 and never released to npmjs.com, is tagged as `classic`.
```typescript
import { deepStrictEqual } from "assert";
import { encode, decode } from "@msgpack/msgpack";

const object = {
  nil: null,
  integer: 1,
  float: Math.PI,
  string: "Hello, world!",
  binary: Uint8Array.from([1, 2, 3]),
  array: [10, 20, 30],
  map: { foo: "bar" },
  timestampExt: new Date(),
};

const encoded: Uint8Array = encode(object);

deepStrictEqual(decode(encoded), object);
```
- `encode(data: unknown, options?: EncoderOptions): Uint8Array`
- `decode(buffer: ArrayLike<number> | BufferSource, options?: DecoderOptions): unknown`
- `decodeMulti(buffer: ArrayLike<number> | BufferSource, options?: DecoderOptions): Generator<unknown, void, unknown>`
- `decodeAsync(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): Promise<unknown>`
- `decodeArrayStream(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): AsyncIterable<unknown>`
- `decodeMultiStream(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): AsyncIterable<unknown>`
This library is published to npmjs.com as `@msgpack/msgpack`.
```shell
npm install @msgpack/msgpack
```
encode(data: unknown, options?: EncoderOptions): Uint8Array
It encodes `data` into a single MessagePack-encoded object and returns a byte array as `Uint8Array`. It throws an error if `data` is, or includes, a non-serializable object such as a `function` or a `symbol`.

For example:
```typescript
import { encode } from "@msgpack/msgpack";

const encoded: Uint8Array = encode({ foo: "bar" });
console.log(encoded);
```
If you'd like to convert a `Uint8Array` to a NodeJS `Buffer`, use `Buffer.from(arrayBuffer, offset, length)` in order not to copy the underlying `ArrayBuffer`, whereas `Buffer.from(uint8array)` copies it:
```typescript
import { encode } from "@msgpack/msgpack";

const encoded: Uint8Array = encode({ foo: "bar" });

// `buffer` refers to the same ArrayBuffer as `encoded`.
const buffer: Buffer = Buffer.from(encoded.buffer, encoded.byteOffset, encoded.byteLength);
console.log(buffer);
```
EncoderOptions
| Name | Type | Default |
|---|---|---|
| extensionCodec | ExtensionCodec | ExtensionCodec.defaultCodec |
| context | user-defined | - |
| useBigInt64 | boolean | false |
| maxDepth | number | 100 |
| initialBufferSize | number | 2048 |
| sortKeys | boolean | false |
| forceFloat32 | boolean | false |
| forceIntegerToFloat | boolean | false |
| ignoreUndefined | boolean | false |
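As a sketch of why `sortKeys` exists: property order affects the encoded bytes, so two semantically equal objects can serialize differently unless keys are sorted first. The helper below is hypothetical and uses JSON as a stand-in serializer; `sortKeys: true` applies the same idea to the map family during encoding.

```typescript
// Sketch: canonical serialization via sorted keys (top-level only).
// JSON stands in for MessagePack here.
function canonicalJson(obj: Record<string, unknown>): string {
  const sortedEntries = Object.entries(obj).sort(([a], [b]) =>
    a < b ? -1 : a > b ? 1 : 0
  );
  return JSON.stringify(Object.fromEntries(sortedEntries));
}

// Same contents, different insertion order -- same canonical output:
const x = canonicalJson({ b: 1, a: 2 });
const y = canonicalJson({ a: 2, b: 1 });
```

Deterministic output like this matters when encoded bytes are hashed, signed, or compared for equality.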
decode(buffer: ArrayLike<number> | BufferSource, options?: DecoderOptions): unknown
It decodes `buffer`, which includes a MessagePack-encoded object, and returns the decoded object typed `unknown`.

`buffer` must be an array of bytes, typically `Uint8Array` or `ArrayBuffer`. `BufferSource` is defined as `ArrayBuffer | ArrayBufferView`.
The `buffer` must contain exactly one encoded object. If the `buffer` contains extra bytes after an object, or is empty, it throws `RangeError`. To decode a `buffer` that includes multiple encoded objects, use `decodeMulti()` or `decodeMultiStream()` (recommended) instead.

For example:
```typescript
import { decode } from "@msgpack/msgpack";

declare const encoded: Uint8Array; // a MessagePack-encoded byte array from somewhere

const object = decode(encoded);
console.log(object);
```
NodeJS `Buffer` is also acceptable because it is a subclass of `Uint8Array`.
DecoderOptions
| Name | Type | Default |
|---|---|---|
| extensionCodec | ExtensionCodec | ExtensionCodec.defaultCodec |
| context | user-defined | - |
| useBigInt64 | boolean | false |
| maxStrLength | number | 4_294_967_295 (UINT32_MAX) |
| maxBinLength | number | 4_294_967_295 (UINT32_MAX) |
| maxArrayLength | number | 4_294_967_295 (UINT32_MAX) |
| maxMapLength | number | 4_294_967_295 (UINT32_MAX) |
| maxExtLength | number | 4_294_967_295 (UINT32_MAX) |
You can use `max${Type}Length` to limit the length of each type being decoded.
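To illustrate what these limits protect against (a sketch of the idea, not this library's internals): a str32 value declares its payload length as a big-endian uint32 right after the `0xdb` type byte, so a hostile payload can claim a multi-gigabyte string. Checking the declared length against a limit before allocating is conceptually what `maxStrLength` does.

```typescript
// Read the declared length of a MessagePack str32 header (0xdb + uint32 BE),
// rejecting lengths above a limit -- the idea behind maxStrLength.
function readStr32Length(bytes: Uint8Array, maxStrLength: number): number {
  if (bytes[0] !== 0xdb) {
    throw new Error("not a str32 header");
  }
  const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);
  const declared = view.getUint32(1); // DataView reads big-endian by default
  if (declared > maxStrLength) {
    throw new RangeError(`declared string length ${declared} exceeds limit ${maxStrLength}`);
  }
  return declared;
}

// A hostile header claiming a ~4 GiB string:
const hostile = Uint8Array.of(0xdb, 0xff, 0xff, 0xff, 0xff);
```

Lowering these limits is a cheap defense when decoding untrusted input.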
decodeMulti(buffer: ArrayLike<number> | BufferSource, options?: DecoderOptions): Generator<unknown, void, unknown>
It decodes `buffer`, which includes multiple MessagePack-encoded objects, and returns the decoded objects as a generator. See also `decodeMultiStream()`, which is an asynchronous variant of this function.

This function is not recommended for decoding MessagePack binaries from an I/O stream (such as a socket), because it is synchronous. Instead, `decodeMultiStream()` decodes a binary stream asynchronously, typically using less CPU and memory.
For example:

```typescript
import { decodeMulti } from "@msgpack/msgpack";

declare const encoded: Uint8Array; // contains multiple MessagePack-encoded objects

for (const object of decodeMulti(encoded)) {
  console.log(object);
}
```
decodeAsync(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): Promise<unknown>
It decodes `stream`, where `ReadableStreamLike<T>` is defined as `ReadableStream<T> | AsyncIterable<T>`, as an async iterable of byte arrays, and returns the decoded object typed `unknown`, wrapped in a `Promise`.

This function works asynchronously, and might use CPU resources more efficiently than the synchronous `decode()`, because it doesn't wait for the whole input to be downloaded.

This function is designed to work with the WHATWG `fetch()` API like this:
```typescript
import { decodeAsync } from "@msgpack/msgpack";

const MSGPACK_TYPE = "application/x-msgpack";

declare const url: string;

const response = await fetch(url);
const contentType = response.headers.get("Content-Type");
if (contentType && contentType.startsWith(MSGPACK_TYPE) && response.body != null) {
  const object = await decodeAsync(response.body);
  // do something with object
} else {
  /* handle errors */
}
```
decodeArrayStream(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): AsyncIterable<unknown>
It is similar to `decodeAsync()`, but it only accepts a `stream` that contains an array of items, and it emits the decoded items one by one.

For example:
```typescript
import { decodeArrayStream } from "@msgpack/msgpack";

declare const stream: AsyncIterable<Uint8Array>; // contains one MessagePack-encoded array

// in an async function:
for await (const item of decodeArrayStream(stream)) {
  console.log(item);
}
```
decodeMultiStream(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): AsyncIterable<unknown>
It is similar to `decodeAsync()` and `decodeArrayStream()`, but the input `stream` must consist of multiple MessagePack-encoded items. This is the asynchronous variant of `decodeMulti()`.

In other words, it can decode an unlimited stream, emitting the decoded items one by one.

For example:
```typescript
import { decodeMultiStream } from "@msgpack/msgpack";

declare const stream: AsyncIterable<Uint8Array>; // contains multiple MessagePack-encoded items

// in an async function:
for await (const item of decodeMultiStream(stream)) {
  console.log(item);
}
```
This function is available since v2.4.0; it was previously called `decodeStream()`.

`Encoder` and `Decoder` classes are provided for better performance by reusing instances:
```typescript
import { deepStrictEqual } from "assert";
import { Encoder, Decoder } from "@msgpack/msgpack";

const object = { foo: "bar" }; // any serializable value

const encoder = new Encoder();
const decoder = new Decoder();

const encoded: Uint8Array = encoder.encode(object);
deepStrictEqual(decoder.decode(encoded), object);
```
According to our benchmark, reusing an `Encoder` instance is about 20% faster than the `encode()` function, and reusing a `Decoder` instance is about 2% faster than the `decode()` function. Note that the results may vary depending on the environment and the data structure.
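If you want to reproduce such comparisons on your own data, a throwaway harness like the following can help. This is a generic sketch, not this library's benchmark suite; `JSON.stringify` stands in here for any operation under test.

```typescript
// Minimal ops/s harness: run fn repeatedly for a fixed wall-clock window.
function opsPerSec(fn: () => void, durationMs = 200): number {
  const start = Date.now();
  let ops = 0;
  while (Date.now() - start < durationMs) {
    fn();
    ops++;
  }
  return Math.round((ops / (Date.now() - start)) * 1000);
}

const payload = { nil: null, nums: [10, 20, 30], map: { foo: "bar" } };
const jsonRate = opsPerSec(() => {
  JSON.stringify(payload);
});
```

For trustworthy numbers, run each candidate several times and discard warm-up iterations; JIT compilation skews short runs.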
`Encoder` and `Decoder` take the same options as `encode()` and `decode()`, respectively.
To handle MessagePack extension types, this library provides the `ExtensionCodec` class.

Here is an example that sets up custom extension types to handle the `Map` and `Set` classes in TypeScript:
```typescript
import { encode, decode, ExtensionCodec } from "@msgpack/msgpack";

const extensionCodec = new ExtensionCodec();

// Set<T>
const SET_EXT_TYPE = 0; // Any in 0-127
extensionCodec.register({
  type: SET_EXT_TYPE,
  encode: (object: unknown): Uint8Array | null => {
    if (object instanceof Set) {
      return encode([...object], { extensionCodec });
    } else {
      return null;
    }
  },
  decode: (data: Uint8Array) => {
    const array = decode(data, { extensionCodec }) as Array<unknown>;
    return new Set(array);
  },
});

// Map<K, V>
const MAP_EXT_TYPE = 1; // Any in 0-127
extensionCodec.register({
  type: MAP_EXT_TYPE,
  encode: (object: unknown): Uint8Array | null => {
    if (object instanceof Map) {
      return encode([...object], { extensionCodec });
    } else {
      return null;
    }
  },
  decode: (data: Uint8Array) => {
    const array = decode(data, { extensionCodec }) as Array<[unknown, unknown]>;
    return new Map(array);
  },
});

const encoded = encode([new Set<any>(), new Map<any, any>()], { extensionCodec });
const decoded = decode(encoded, { extensionCodec });
```
Ensure you include your `extensionCodec` in any recursive `encode()` and `decode()` calls!
Note that extension types for custom objects must be in the range `[0, 127]`, while `[-128, -1]` is reserved for MessagePack itself.
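A small guard for this constraint can fail fast at registration time. The helper below is hypothetical, not part of this library's API; the library itself validates types on `register()`.

```typescript
// Reject extension type numbers outside the range reserved for applications.
function assertCustomExtType(type: number): void {
  if (!Number.isInteger(type) || type < 0 || type > 127) {
    throw new RangeError(`custom extension type must be an integer in [0, 127], got ${type}`);
  }
}
```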
When you use an extension codec, it might be necessary to keep encoding/decoding state to track which objects have been encoded or re-created. To do this, pass a `context` to the `EncoderOptions` and `DecoderOptions`:
```typescript
import { encode, decode, ExtensionCodec } from "@msgpack/msgpack";

class MyContext {
  track(object: any) { /* ... */ }
}

class MyType {
  constructor(data?: unknown) { /* ... */ }
  toJSON(): unknown { /* ... */ return {}; }
}

const extensionCodec = new ExtensionCodec<MyContext>();

// MyType
const MYTYPE_EXT_TYPE = 0; // Any in 0-127
extensionCodec.register({
  type: MYTYPE_EXT_TYPE,
  encode: (object, context) => {
    if (object instanceof MyType) {
      context.track(object); // <-- like this
      return encode(object.toJSON(), { extensionCodec, context });
    } else {
      return null;
    }
  },
  decode: (data, extType, context) => {
    const decoded = decode(data, { extensionCodec, context });
    const my = new MyType(decoded);
    context.track(my); // <-- and like this
    return my;
  },
});

// and later:
const context = new MyContext();

const encoded = encode({ myType: new MyType() }, { extensionCodec, context });
const decoded = decode(encoded, { extensionCodec, context });
```
This library does not handle `bigint` by default, but you have two options to handle it:

- Set `useBigInt64: true` to map `bigint` to MessagePack's int64/uint64
- Define a custom `ExtensionCodec` to map `bigint` to a MessagePack extension type

`useBigInt64: true` is the simplest way to handle `bigint`, but it has limitations, so you might want to define a custom codec to handle `bigint` like this:
```typescript
import { deepStrictEqual } from "assert";
import { encode, decode, ExtensionCodec, DecodeError } from "@msgpack/msgpack";

// to define a custom codec:
const BIGINT_EXT_TYPE = 0; // Any in 0-127
const extensionCodec = new ExtensionCodec();
extensionCodec.register({
  type: BIGINT_EXT_TYPE,
  encode(input: unknown): Uint8Array | null {
    if (typeof input === "bigint") {
      if (input <= Number.MAX_SAFE_INTEGER && input >= Number.MIN_SAFE_INTEGER) {
        return encode(Number(input));
      } else {
        return encode(String(input));
      }
    } else {
      return null;
    }
  },
  decode(data: Uint8Array): bigint {
    const val = decode(data);
    if (!(typeof val === "string" || typeof val === "number")) {
      throw new DecodeError(`unexpected BigInt source: ${val} (${typeof val})`);
    }
    return BigInt(val);
  },
});

// to use it:
const value = BigInt(Number.MAX_SAFE_INTEGER) + BigInt(1);
const encoded = encode(value, { extensionCodec });
deepStrictEqual(decode(encoded, { extensionCodec }), value);
```
There is a proposal for new date/time representations in JavaScript: the Temporal proposal (https://github.com/tc39/proposal-temporal).

This library maps `Date` to the MessagePack timestamp extension by default, but you can re-map the Temporal module (or a Temporal polyfill) to the timestamp extension like this:
```typescript
import { Instant } from "@std-proposal/temporal";
import { deepStrictEqual } from "assert";
import {
  encode,
  decode,
  ExtensionCodec,
  EXT_TIMESTAMP,
  encodeTimeSpecToTimestamp,
  decodeTimestampToTimeSpec,
} from "@msgpack/msgpack";

// to define a custom codec
const extensionCodec = new ExtensionCodec();
extensionCodec.register({
  type: EXT_TIMESTAMP, // override the default behavior!
  encode(input: unknown): Uint8Array | null {
    if (input instanceof Instant) {
      const sec = input.seconds;
      const nsec = Number(input.nanoseconds - BigInt(sec) * BigInt(1e9));
      return encodeTimeSpecToTimestamp({ sec, nsec });
    } else {
      return null;
    }
  },
  decode(data: Uint8Array): Instant {
    const timeSpec = decodeTimestampToTimeSpec(data);
    const sec = BigInt(timeSpec.sec);
    const nsec = BigInt(timeSpec.nsec);
    return Instant.fromEpochNanoseconds(sec * BigInt(1e9) + nsec);
  },
});

// to use it
const instant = Instant.fromEpochMilliseconds(Date.now());
const encoded = encode(instant, { extensionCodec });
const decoded = decode(encoded, { extensionCodec });
deepStrictEqual(decoded, instant);
```
If the Temporal module is standardized, this mapping may become the default in this library in a future major version.
`Blob` is a binary data container provided by browsers. To read its contents, you can use `Blob#arrayBuffer()` or `Blob#stream()`. `Blob#stream()` is recommended if your target platform supports it, because streaming decode should be faster for large objects. Either way, you need to use the asynchronous API.
```typescript
import { decode, decodeAsync } from "@msgpack/msgpack";

async function decodeFromBlob(blob: Blob): Promise<unknown> {
  if (blob.stream) {
    // Blob#stream(): ReadableStream<Uint8Array> (recommended)
    return await decodeAsync(blob.stream());
  } else {
    // Blob#arrayBuffer(): Promise<ArrayBuffer> (if stream() is not available)
    return decode(await blob.arrayBuffer());
  }
}
```
This library is compatible with the "August 2017" revision of the MessagePack specification, which added the timestamp ext type. The living specification is here: https://github.com/msgpack/msgpack

Note that as of June 2019 there is no official "version" of the MessagePack specification. See https://github.com/msgpack/msgpack/issues/195 for the discussion.
The following table shows how JavaScript values are mapped to MessagePack formats and vice versa.
The mapping of integers varies depending on the `useBigInt64` setting.

The default, `useBigInt64: false`, is:
| Source Value | MessagePack Format | Value Decoded |
|---|---|---|
| null, undefined | nil | null (*1) |
| boolean (true, false) | bool family | boolean (true, false) |
| number (53-bit int) | int family | number |
| number (64-bit float) | float family | number |
| string | str family | string |
| ArrayBufferView | bin family | Uint8Array (*2) |
| Array | array family | Array |
| Object | map family | Object (*3) |
| Date | timestamp ext family | Date (*4) |
| bigint | N/A | N/A (*5) |
- *1 `null` and `undefined` are mapped to the `nil` (`0xC0`) type, and are decoded into `null`
- *2 `ArrayBufferView`s, including NodeJS's `Buffer`, are mapped to the `bin` family, and are decoded into `Uint8Array`
- *3 An `Object` is regarded as `Record<string, unknown>` in terms of TypeScript
- *4 `Date` is mapped to the timestamp ext family, and is decoded into `Date`. This behavior can be overridden by registering `-1` for the extension codec
- *5 `bigint` is not supported in `useBigInt64: false` mode, but you can define an extension codec for it

If you set `useBigInt64: true`, the following mapping is used:
| Source Value | MessagePack Format | Value Decoded |
|---|---|---|
| null, undefined | nil | null |
| boolean (true, false) | bool family | boolean (true, false) |
| number (32-bit int) | int family | number |
| number (except for the above) | float family | number |
| bigint | int64 / uint64 | bigint (*6) |
| string | str family | string |
| ArrayBufferView | bin family | Uint8Array |
| Array | array family | Array |
| Object | map family | Object |
| Date | timestamp ext family | Date |
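To make the mapping concrete at the byte level, here is a hand-rolled sketch of a few single-object encodings taken from the public MessagePack format specification (nil, bool, positive fixint, fixstr). This illustrates the wire format only; it is not this library's implementation.

```typescript
// Encode a few scalar values by hand, per the MessagePack spec:
//   nil            -> 0xc0
//   false / true   -> 0xc2 / 0xc3
//   0..127         -> positive fixint (the value itself)
//   str (<= 31 B)  -> fixstr: 0xa0 | length, then UTF-8 bytes
function encodeScalar(value: null | boolean | number | string): Uint8Array {
  if (value === null) return Uint8Array.of(0xc0);
  if (value === false) return Uint8Array.of(0xc2);
  if (value === true) return Uint8Array.of(0xc3);
  if (typeof value === "number") {
    if (Number.isInteger(value) && value >= 0 && value <= 0x7f) {
      return Uint8Array.of(value);
    }
    throw new RangeError("only positive fixint is covered by this sketch");
  }
  const utf8 = new TextEncoder().encode(value);
  if (utf8.length > 31) throw new RangeError("only fixstr is covered by this sketch");
  return Uint8Array.of(0xa0 | utf8.length, ...utf8);
}
```

For example, `encodeScalar("abc")` produces the four bytes `0xa3 0x61 0x62 0x63`: a fixstr header declaring three bytes of UTF-8, followed by the bytes themselves.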
This is a universal JavaScript library that supports major browsers and NodeJS. It requires ES2015+ language features and the ES2022 standard library, including `TextEncoder` and `TextDecoder`. The ES2022 standard library used in this library can be polyfilled with core-js.
IE11 is no longer supported. If you'd like to use this library in IE11, use v2.x versions.
NodeJS v14 is required.
This module requires type definitions for `AsyncIterator`, `SourceBuffer`, WHATWG streams, and so on. They are provided by `"lib": ["ES2021", "DOM"]` in `tsconfig.json`.
Regarding the TypeScript compiler version, only the latest TypeScript is tested in development.
Run-time performance is not the only reason to use MessagePack, but it is important when choosing a MessagePack library, so a benchmark suite is provided to monitor the performance of this library.

V8's built-in JSON has been improved over the years; in particular, `JSON.parse()` was significantly improved in V8 7.6, making it the fastest deserializer as of 2019, as the benchmark results below suggest.

However, MessagePack can handle binary data effectively, and actual performance depends on the situation. You should benchmark your own use case if performance matters.
Benchmark on NodeJS/v18.1.0 (V8/10.1)
| operation | op | ms | op/s |
|---|---|---|---|
| buf = Buffer.from(JSON.stringify(obj)); | 902100 | 5000 | 180420 |
| obj = JSON.parse(buf.toString("utf-8")); | 898700 | 5000 | 179740 |
| buf = require("msgpack-lite").encode(obj); | 411000 | 5000 | 82200 |
| obj = require("msgpack-lite").decode(buf); | 246200 | 5001 | 49230 |
| buf = require("@msgpack/msgpack").encode(obj); | 843300 | 5000 | 168660 |
| obj = require("@msgpack/msgpack").decode(buf); | 489300 | 5000 | 97860 |
| buf = /* @msgpack/msgpack */ encoder.encode(obj); | 1154200 | 5000 | 230840 |
| obj = /* @msgpack/msgpack */ decoder.decode(buf); | 448900 | 5000 | 89780 |
Note that the `JSON` cases use `Buffer` to emulate I/O, where a JavaScript string must be converted into a byte array encoded in UTF-8, whereas the MessagePack modules deal with byte arrays directly.
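The UTF-8 round trip the `JSON` rows pay for can be seen in isolation (a sketch of the emulation described above; the object literal is made up for illustration):

```typescript
// Emulated I/O for JSON: object -> JSON string -> UTF-8 bytes -> string -> object.
// MessagePack skips the string<->bytes conversions because it produces bytes directly.
const obj = { foo: "bar", nums: [1, 2, 3] };

const wireBytes: Buffer = Buffer.from(JSON.stringify(obj)); // serialize + UTF-8 encode
const roundTripped = JSON.parse(wireBytes.toString("utf-8")); // UTF-8 decode + parse
```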
The NPM package distributed on npmjs.com includes both ES2015+ and ES5 files:

- `dist/` is compiled into ES2019 with CommonJS, provided for NodeJS v10
- `dist.es5+umd/` is compiled into ES5 with UMD
  - `dist.es5+umd/msgpack.min.js` - the minified file
  - `dist.es5+umd/msgpack.js` - the non-minified file
- `dist.es5+esm/` is compiled into ES5 with ES modules, provided for webpack-like bundlers and NodeJS's ESM mode

If you use NodeJS and/or webpack, their module resolvers pick the suitable one automatically.
This library is available via CDN:
```html
<script crossorigin src="https://unpkg.com/@msgpack/msgpack"></script>
```
It loads the `MessagePack` module into the global object.
You can use this module on Deno. See `example/deno-*.ts` for examples. `deno.land/x` is not supported yet.
For simple testing:

```shell
npm run test
```

This library uses Travis CI.

Test matrix:

- `target=es2019` / `target=es5`

See `test:*` in `package.json` and `.travis.yml` for details.
```shell
# run tests on NodeJS, Chrome, and Firefox
make test-all

# edit the changelog
code CHANGELOG.md

# bump version
npm version patch|minor|major

# run the publishing task
make publish
```
```shell
npm run update-dependencies
```
Copyright 2019 The MessagePack community.
This software uses the ISC license:
https://opensource.org/licenses/ISC
See LICENSE for details.