MessagePack for JavaScript/TypeScript/ECMA-262 / msgpack.org[JavaScript]
npm install algo-msgpack-with-bigint
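Below is a minimal usage sketch. It assumes this fork re-exports the same top-level encode/decode API as the upstream @msgpack/msgpack package it is derived from:

```typescript
import { deepStrictEqual } from "assert";
// Assumption: the fork keeps the same exports as @msgpack/msgpack.
import { encode, decode } from "algo-msgpack-with-bigint";

const object = { foo: "bar", count: 42 };
const encoded: Uint8Array = encode(object);
deepStrictEqual(decode(encoded), object);
```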
Languages: TypeScript (93.9%), JavaScript (4.97%), Makefile (0.78%), HTML (0.35%)
License: ISC
Stars: 1
Commits: 607
Forks: 2
Branches: 4
Contributors: 2
Updated on: Jan 11, 2023
Latest Version: 2.1.1
Package Id: algo-msgpack-with-bigint@2.1.1
Size: 104.24 kB
NPM Version: 6.14.8
Node Version: 14.4.0
Published on: Oct 21, 2020
This is a JavaScript/ECMA-262 implementation of MessagePack, an efficient binary serialization format: https://msgpack.org/
This library is universal JavaScript, meaning it is compatible with all major browsers and NodeJS. In addition, because it is implemented in TypeScript, type definition files (d.ts) are bundled in the distribution.
Note that this is the second version of MessagePack for JavaScript. The first version, which was implemented in ES5 and was never released to npmjs.com, is tagged as classic.
```typescript
import { deepStrictEqual } from "assert";
import { encode, decode } from "@msgpack/msgpack";

const object = {
  nil: null,
  integer: 1,
  float: Math.PI,
  string: "Hello, world!",
  binary: Uint8Array.from([1, 2, 3]),
  array: [10, 20, 30],
  map: { foo: "bar" },
  timestampExt: new Date(),
};

const encoded: Uint8Array = encode(object);

deepStrictEqual(decode(encoded), object);
```
encode(data: unknown, options?: EncodeOptions): Uint8Array
decode(buffer: ArrayLike<number> | ArrayBuffer, options?: DecodeOptions): unknown
decodeAsync(stream: AsyncIterable<ArrayLike<number>> | ReadableStream<ArrayLike<number>>, options?: DecodeAsyncOptions): Promise<unknown>
decodeArrayStream(stream: AsyncIterable<ArrayLike<number>> | ReadableStream<ArrayLike<number>>, options?: DecodeAsyncOptions): AsyncIterable<unknown>
decodeStream(stream: AsyncIterable<ArrayLike<number>> | ReadableStream<ArrayLike<number>>, options?: DecodeAsyncOptions): AsyncIterable<unknown>
This library is published to npmjs.com as @msgpack/msgpack.
```shell
npm install @msgpack/msgpack
```
encode(data: unknown, options?: EncodeOptions): Uint8Array
It encodes data and returns a byte array as Uint8Array, throwing an error if data is, or includes, a non-serializable object such as a function or a symbol.

For example:
```typescript
import { encode } from "@msgpack/msgpack";

const encoded: Uint8Array = encode({ foo: "bar" });
console.log(encoded);
```
If you'd like to convert the Uint8Array to a NodeJS Buffer, use Buffer.from(arrayBuffer, offset, length) in order not to copy the underlying ArrayBuffer, whereas Buffer.from(uint8array) copies it:
```typescript
import { encode } from "@msgpack/msgpack";

const encoded: Uint8Array = encode({ foo: "bar" });

// `buffer` refers to the same ArrayBuffer as `encoded`, without copying.
const buffer: Buffer = Buffer.from(encoded.buffer, encoded.byteOffset, encoded.byteLength);
console.log(buffer);
```
EncodeOptions

Name | Type | Default |
---|---|---|
extensionCodec | ExtensionCodec | ExtensionCodec.defaultCodec |
maxDepth | number | 100 |
initialBufferSize | number | 2048 |
sortKeys | boolean | false |
forceFloat32 | boolean | false |
forceIntegerToFloat | boolean | false |
ignoreUndefined | boolean | false |
context | user-defined | - |
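For instance, here is a hedged sketch of passing a few of these options to encode(); the option names come from the table above, while the values are purely illustrative:

```typescript
import { encode } from "@msgpack/msgpack";

const encoded = encode(
  { b: 2, a: 1, skipped: undefined, ratio: 0.25 },
  {
    sortKeys: true,        // sort map keys for deterministic output
    ignoreUndefined: true, // omit `skipped` instead of encoding it as nil
    forceFloat32: true,    // encode `ratio` as float32 instead of float64
  },
);
console.log(encoded);
```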
decode(buffer: ArrayLike<number> | ArrayBuffer, options?: DecodeOptions): unknown
It decodes buffer encoded in MessagePack, and returns the decoded object as unknown.

buffer must be an array of bytes, which is typically a Uint8Array or an ArrayBuffer.

For example:
```typescript
import { decode } from "@msgpack/msgpack";

declare const encoded: Uint8Array; // a MessagePack-encoded byte array obtained elsewhere
const object = decode(encoded);
console.log(object);
```
A NodeJS Buffer is also acceptable because it is a subclass of Uint8Array.
DecodeOptions

Name | Type | Default |
---|---|---|
extensionCodec | ExtensionCodec | ExtensionCodec.defaultCodec |
maxStrLength | number | 4_294_967_295 (UINT32_MAX) |
maxBinLength | number | 4_294_967_295 (UINT32_MAX) |
maxArrayLength | number | 4_294_967_295 (UINT32_MAX) |
maxMapLength | number | 4_294_967_295 (UINT32_MAX) |
maxExtLength | number | 4_294_967_295 (UINT32_MAX) |
context | user-defined | - |
You can use max${Type}Length to limit the length of each type decoded.
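For example, a sketch of tightening these limits when decoding untrusted input; the specific numbers are arbitrary:

```typescript
import { decode } from "@msgpack/msgpack";

declare const untrustedInput: Uint8Array; // bytes received from an untrusted peer

// Reject overly large strings, binaries, arrays, maps, and extensions early.
const object = decode(untrustedInput, {
  maxStrLength: 1_000_000,
  maxBinLength: 1_000_000,
  maxArrayLength: 10_000,
  maxMapLength: 10_000,
  maxExtLength: 1_000_000,
});
console.log(object);
```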
decodeAsync(stream: AsyncIterable<ArrayLike<number>> | ReadableStream<ArrayLike<number>>, options?: DecodeAsyncOptions): Promise<unknown>
It decodes stream, an async iterable of byte arrays, and returns the decoded object as unknown, wrapped in a Promise. This function works asynchronously.

DecodeAsyncOptions is the same as DecodeOptions for decode().

This function is designed to work with WHATWG fetch() like this:
```typescript
import { decodeAsync } from "@msgpack/msgpack";

declare const url: string; // the endpoint that returns MessagePack-encoded data

const MSGPACK_TYPE = "application/x-msgpack";

const response = await fetch(url);
const contentType = response.headers.get("Content-Type");
if (contentType && contentType.startsWith(MSGPACK_TYPE) && response.body != null) {
  const object = await decodeAsync(response.body);
  // do something with object
} else {
  /* handle errors */
}
```
decodeArrayStream(stream: AsyncIterable<ArrayLike<number>> | ReadableStream<ArrayLike<number>>, options?: DecodeAsyncOptions): AsyncIterable<unknown>
It is similar to decodeAsync(), but only accepts an array as the input stream, and emits the decoded items one by one.

It throws an error if the input is not from the array family.

For example:
```typescript
import { decodeArrayStream } from "@msgpack/msgpack";

declare const stream: AsyncIterable<Uint8Array>; // a stream of byte chunks obtained elsewhere

// in an async function:
for await (const item of decodeArrayStream(stream)) {
  console.log(item);
}
```
decodeStream(stream: AsyncIterable<ArrayLike<number>> | ReadableStream<ArrayLike<number>>, options?: DecodeAsyncOptions): AsyncIterable<unknown>
It is similar to decodeAsync() and decodeArrayStream(), but the input stream consists of independent MessagePack items. In other words, it decodes an unlimited stream and emits the items one by one.

For example:
```typescript
import { decodeStream } from "@msgpack/msgpack";

declare const stream: AsyncIterable<Uint8Array>; // a stream of byte chunks obtained elsewhere

// in an async function:
for await (const item of decodeStream(stream)) {
  console.log(item);
}
```
Encoder and Decoder classes are provided for better performance:
```typescript
import { deepStrictEqual } from "assert";
import { Encoder, Decoder } from "@msgpack/msgpack";

const object = { foo: "bar" }; // any serializable value

const encoder = new Encoder();
const decoder = new Decoder();

const encoded: Uint8Array = encoder.encode(object);
deepStrictEqual(decoder.decode(encoded), object);
```
According to our benchmark, reusing an Encoder instance is about 20% faster than the encode() function, and reusing a Decoder instance is about 2% faster than the decode() function. Note that the results vary with the environment and the data structure.
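As a sketch of what reuse means in practice, a single Encoder/Decoder pair can be shared across many messages on a hot path; the roundTrip helper below is hypothetical:

```typescript
import { Encoder, Decoder } from "@msgpack/msgpack";

// Create the instances once and reuse them for every message.
const encoder = new Encoder();
const decoder = new Decoder();

function roundTrip(messages: ReadonlyArray<unknown>): unknown[] {
  return messages.map((message) => decoder.decode(encoder.encode(message)));
}

console.log(roundTrip([{ seq: 1 }, { seq: 2 }, { seq: 3 }]));
```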
To handle MessagePack Extension Types, this library provides an ExtensionCodec class.

Here is an example of setting up custom extension types that handle the Map and Set classes in TypeScript:
```typescript
import { encode, decode, ExtensionCodec } from "@msgpack/msgpack";

const extensionCodec = new ExtensionCodec();

// Set<T>
const SET_EXT_TYPE = 0; // Any in 0-127
extensionCodec.register({
  type: SET_EXT_TYPE,
  encode: (object: unknown): Uint8Array | null => {
    if (object instanceof Set) {
      return encode([...object]);
    } else {
      return null;
    }
  },
  decode: (data: Uint8Array) => {
    const array = decode(data) as Array<unknown>;
    return new Set(array);
  },
});

// Map<K, V>
const MAP_EXT_TYPE = 1; // Any in 0-127
extensionCodec.register({
  type: MAP_EXT_TYPE,
  encode: (object: unknown): Uint8Array | null => {
    if (object instanceof Map) {
      return encode([...object]);
    } else {
      return null;
    }
  },
  decode: (data: Uint8Array) => {
    const array = decode(data) as Array<[unknown, unknown]>;
    return new Map(array);
  },
});

// and later
const encoded = encode([new Set<any>(), new Map<any, any>()], { extensionCodec });
const decoded = decode(encoded, { extensionCodec });
```
Note that extension types for custom objects must be in the range [0, 127], while [-1, -128] is reserved for MessagePack itself.
When using an extension codec, it may be necessary to keep encoding/decoding state in order to keep track of which objects got encoded/re-created. To do this, pass a context to the EncodeOptions and DecodeOptions (and, if using TypeScript, type the ExtensionCodec too). Don't forget to pass the {extensionCodec, context} along during recursive encoding/decoding:
```typescript
import { encode, decode, ExtensionCodec } from "@msgpack/msgpack";

class MyContext {
  track(object: any) { /*...*/ }
}

class MyType { /* ... */ }

const extensionCodec = new ExtensionCodec<MyContext>();

// MyType
const MYTYPE_EXT_TYPE = 0; // Any in 0-127
extensionCodec.register({
  type: MYTYPE_EXT_TYPE,
  encode: (object, context) => {
    if (object instanceof MyType) {
      context.track(object); // <-- like this
      return encode(object.toJSON(), { extensionCodec, context });
    } else {
      return null;
    }
  },
  decode: (data, extType, context) => {
    const decoded = decode(data, { extensionCodec, context });
    const my = new MyType(decoded);
    context.track(my); // <-- and like this
    return my;
  },
});

// and later
const context = new MyContext();

const encoded = encode({ myType: new MyType() }, { extensionCodec, context });
const decoded = decode(encoded, { extensionCodec, context });
```
This library does not handle BigInt by default, but you can handle it with an ExtensionCodec like this:
```typescript
import { deepStrictEqual } from "assert";
import { encode, decode, ExtensionCodec } from "@msgpack/msgpack";

const BIGINT_EXT_TYPE = 0; // Any in 0-127
const extensionCodec = new ExtensionCodec();
extensionCodec.register({
  type: BIGINT_EXT_TYPE,
  encode: (input: unknown) => {
    if (typeof input === "bigint") {
      return encode(input.toString());
    } else {
      return null;
    }
  },
  decode: (data: Uint8Array) => {
    return BigInt(decode(data) as string);
  },
});

const value = BigInt(Number.MAX_SAFE_INTEGER) + BigInt(1);
const encoded = encode(value, { extensionCodec });
deepStrictEqual(decode(encoded, { extensionCodec }), value);
```
There is a proposal for new date/time representations in JavaScript: the Temporal proposal (https://github.com/tc39/proposal-temporal).
This library maps Date to the MessagePack timestamp extension by default, but you can re-map the Temporal module (or a Temporal polyfill) to the timestamp extension like this:
```typescript
import { Instant } from "@std-proposal/temporal";
import { deepStrictEqual } from "assert";
import {
  encode,
  decode,
  ExtensionCodec,
  EXT_TIMESTAMP,
  encodeTimeSpecToTimestamp,
  decodeTimestampToTimeSpec,
} from "@msgpack/msgpack";

const extensionCodec = new ExtensionCodec();
extensionCodec.register({
  type: EXT_TIMESTAMP, // override the default behavior!
  encode: (input: any) => {
    if (input instanceof Instant) {
      const sec = input.seconds;
      const nsec = Number(input.nanoseconds - BigInt(sec) * BigInt(1e9));
      return encodeTimeSpecToTimestamp({ sec, nsec });
    } else {
      return null;
    }
  },
  decode: (data: Uint8Array) => {
    const timeSpec = decodeTimestampToTimeSpec(data);
    const sec = BigInt(timeSpec.sec);
    const nsec = BigInt(timeSpec.nsec);
    return Instant.fromEpochNanoseconds(sec * BigInt(1e9) + nsec);
  },
});

const instant = Instant.fromEpochMilliseconds(Date.now());
const encoded = encode(instant, { extensionCodec });
const decoded = decode(encoded, { extensionCodec });
deepStrictEqual(decoded, instant);
```
This will become the default once the Temporal module is standardized, though that will not happen in the near future.
Blob is a binary data container provided by browsers. To read its contents, you can use Blob#arrayBuffer() or Blob#stream(). Blob#stream() is recommended if your target platform supports it, because streaming decode should be faster for large objects. Either way, you need to use the asynchronous API.
```typescript
import { decode, decodeAsync } from "@msgpack/msgpack";

async function decodeFromBlob(blob: Blob): Promise<unknown> {
  if (blob.stream) {
    // Blob#stream(): ReadableStream<Uint8Array> (recommended)
    return await decodeAsync(blob.stream());
  } else {
    // Blob#arrayBuffer(): Promise<ArrayBuffer> (if stream() is not available)
    return decode(await blob.arrayBuffer());
  }
}
```
This library is compatible with the "August 2017" revision of the MessagePack specification, at the point where the timestamp ext type was added.

The living specification is here: https://github.com/msgpack/msgpack

Note that as of June 2019 there is no official "version" of the MessagePack specification. See https://github.com/msgpack/msgpack/issues/195 for the discussion.
The following table shows how JavaScript values are mapped to MessagePack formats and vice versa.
Source Value | MessagePack Format | Value Decoded |
---|---|---|
null, undefined | nil | null (*1) |
boolean (true, false) | bool family | boolean (true, false) |
number (53-bit int) | int family | number (53-bit int) |
number (64-bit float) | float family | number (64-bit float) |
string | str family | string |
ArrayBufferView | bin family | Uint8Array (*2) |
Array | array family | Array |
Object | map family | Object (*3) |
Date | timestamp ext family | Date (*4) |
*1 null and undefined are mapped to the nil (0xC0) type, and are decoded into null.
*2 ArrayBufferViews, including NodeJS's Buffer, are mapped to the bin family, and are decoded into Uint8Array.
*3 When handling an Object, it is regarded as Record<string, unknown> in terms of TypeScript.
*4 Date is mapped to the timestamp ext family; this behavior can be overridden by registering -1 for the extension codec.
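A small sketch illustrating notes *1 and *2 above:

```typescript
import { deepStrictEqual } from "assert";
import { encode, decode } from "@msgpack/msgpack";

// undefined is encoded as nil and decoded back as null (*1).
deepStrictEqual(decode(encode(undefined)), null);

// A NodeJS Buffer is encoded as bin and decoded into a plain Uint8Array (*2).
deepStrictEqual(decode(encode(Buffer.from([1, 2, 3]))), Uint8Array.from([1, 2, 3]));
```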
This is a universal JavaScript library that supports major browsers and NodeJS.
The ES2018 standard library features used in this library can be polyfilled with core-js. If you support IE11, import core-js in your application entry points, as this library does in testing for browsers.
NodeJS v10 is required, but NodeJS v12 or later is recommended because it includes the V8 improvement "Improving DataView performance in V8".

NodeJS before v10 will work by importing @msgpack/msgpack/dist.es5/msgpack.
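For example, a hedged sketch of importing the ES5 build explicitly, assuming it exposes the same exports as the default entry point:

```typescript
// Explicitly use the bundled ES5 build instead of the default ES2015+ entry point.
import { encode, decode } from "@msgpack/msgpack/dist.es5/msgpack";

console.log(decode(encode({ legacy: true })));
```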
Run-time performance is not the only reason to use MessagePack, but it is important when choosing a MessagePack library, so a benchmark suite is provided to monitor the performance of this library.

V8's built-in JSON has been improved for years; in particular, JSON.parse() was significantly improved in V8 7.6 and is the fastest deserializer as of 2019, as the benchmark results below suggest.

However, MessagePack can handle binary data effectively, and actual performance depends on the situation. You should benchmark your own use case if performance matters.
Benchmark on NodeJS/v12.18.3 (V8/7.8)
operation | op | ms | op/s |
---|---|---|---|
buf = Buffer.from(JSON.stringify(obj)); | 840700 | 5000 | 168140 |
buf = JSON.stringify(obj); | 1249800 | 5000 | 249960 |
obj = JSON.parse(buf); | 1648000 | 5000 | 329600 |
buf = require("msgpack-lite").encode(obj); | 603500 | 5000 | 120700 |
obj = require("msgpack-lite").decode(buf); | 315900 | 5000 | 63180 |
buf = require("@msgpack/msgpack").encode(obj); | 945400 | 5000 | 189080 |
obj = require("@msgpack/msgpack").decode(buf); | 770200 | 5000 | 154040 |
buf = /* @msgpack/msgpack */ encoder.encode(obj); | 1162600 | 5000 | 232520 |
obj = /* @msgpack/msgpack */ decoder.decode(buf); | 787800 | 5000 | 157560 |
Note that Buffer.from() applied to the result of JSON.stringify() is necessary to emulate I/O, where a JavaScript string must be converted into a byte array encoded in UTF-8, whereas MessagePack's encode() already returns a byte array.
The npm package distributed on npmjs.com includes both ES2015+ and ES5 files:

- dist/ is compiled into ES2015+
- dist.es5/ is compiled into ES5 and bundled into a single file
- dist.es5/msgpack.min.js - the default, minified file (UMD)
- dist.es5/msgpack.js - an optional, non-minified file (UMD)

If you use NodeJS and/or webpack, their module resolvers select the suitable one automatically.
This library is available via CDN:
```html
<script crossorigin src="https://unpkg.com/@msgpack/msgpack"></script>
```
It loads the MessagePack module into the global object.
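For example, a sketch of using that global from a plain page; the global name MessagePack comes from the UMD bundle described above:

```html
<script crossorigin src="https://unpkg.com/@msgpack/msgpack"></script>
<script>
  // MessagePack.encode() / MessagePack.decode() are available on the global object.
  var encoded = MessagePack.encode({ foo: "bar" });
  console.log(MessagePack.decode(encoded));
</script>
```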
For simple testing:
npm run test
This library uses Travis CI.
Test matrix: target=es2019 / target=es5
See test:* in package.json and .travis.yml for details.
```shell
# run tests on NodeJS, Chrome, and Firefox
make test-all

# edit the changelog
code CHANGELOG.md

# bump version
npm version patch|minor|major

# run the publishing task
make publish
```
```shell
npm run update-dependencies
```
Cross-browser Testing Platform and Open Source <3 Provided by Sauce Labs.
Copyright 2019 The MessagePack community.
This software uses the ISC license:
https://opensource.org/licenses/ISC
See LICENSE for details.
No vulnerabilities found.

OpenSSF Scorecard results (last scanned on 2025-07-07):

- no binaries found in the repo
- license file detected
- Found 0/30 approved changesets -- score normalized to 0
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- no SAST tool detected
- no effort to earn an OpenSSF best practices badge detected
- security policy file not detected
- project is not fuzzed
- branch protection not enabled on development/release branches
- 70 existing vulnerabilities detected

The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.