Installation
npm install micro-memoize-import-type
Developer Guide
TypeScript: Yes
Module System: CommonJS, UMD
Node Version: 12.7.0
NPM Version: 6.9.0

Score: 75.3
Supply Chain: 99.5
Quality: 75.3
Maintenance: 100
Vulnerability: 100
License
Releases
- 4.2.0-beta.0 (published 05 Jan 2025)
- 4.1.3 (published 05 Jan 2025)
- 5.0.0-beta.2 (published 18 Jan 2024)
- 5.0.0-beta.1 (published 08 Jan 2024)
- 5.0.0-beta.0 (published 08 Jan 2024)
- 4.1.2 (published 08 May 2023)
Languages: TypeScript (78.94%), JavaScript (21.06%)
Developer: planttheidea
Download Statistics
Total Downloads: 860
Last Day: 1
Last Week: 3
Last Month: 12
Last Year: 138
GitHub Statistics
242 Stars
265 Commits
16 Forks
6 Watching
10 Branches
7 Contributors
Bundle Size
Minified: 3.76 kB
Minified + Gzipped: 1.45 kB
Package Meta Information
Latest Version: 4.0.10
Package Id: micro-memoize-import-type@4.0.10
Unpacked Size: 238.65 kB
Size: 45.63 kB
File Count: 30
NPM Version: 6.9.0
Node Version: 12.7.0
Total Downloads
Cumulative downloads: 860

| Period | Downloads | Change vs. previous period |
| --- | --- | --- |
| Last day | 1 | 0% |
| Last week | 3 | 0% |
| Last month | 12 | +71.4% |
| Last year | 138 | -43.4% |
Dev Dependencies: 43
micro-memoize
A tiny, crazy fast memoization library for the 95% use-case
Summary

As the author of `moize`, I created a consistently fast memoization library, but `moize` has a lot of features to satisfy a large number of edge cases. `micro-memoize` is a simpler approach, focusing on the core feature set with a much smaller footprint (~1.5kB minified+gzipped). Stripping out these edge cases also allows `micro-memoize` to be faster across the board than `moize`.
Importing
ESM in browsers:

```ts
import memoize from 'micro-memoize';
```

ESM in NodeJS:

```ts
import memoize from 'micro-memoize/mjs';
```

CommonJS:

```js
const memoize = require('micro-memoize');
```
Usage
```ts
const assembleToObject = (one: string, two: string) => ({ one, two });

const memoized = memoize(assembleToObject);

console.log(memoized('one', 'two')); // {one: 'one', two: 'two'}
console.log(memoized('one', 'two')); // pulled from cache, {one: 'one', two: 'two'}
```
Types
If you need them, all types are available under the `MicroMemoize` namespace.

```ts
import { MicroMemoize } from 'micro-memoize';
```
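For example, a small (hypothetical) helper can be typed with the namespaced `MicroMemoize.Key` and `MicroMemoize.Value` types, which are also referenced in the cache shape shown later in this README:

```ts
import memoize, { MicroMemoize } from 'micro-memoize';

// Hypothetical helper typed with the namespaced types (MicroMemoize.Key and
// MicroMemoize.Value are also mentioned in the `memoized.cache` section below).
function logCacheContents(
  keys: MicroMemoize.Key[],
  values: MicroMemoize.Value[],
): void {
  keys.forEach((key, index) => {
    console.log('key:', key, '-> value:', values[index]);
  });
}

const memoized = memoize((one: string, two: string) => ({ one, two }));

memoized('one', 'two');

logCacheContents(memoized.cache.snapshot.keys, memoized.cache.snapshot.values);
```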
Composition
Starting in `4.0.0`, you can compose memoized functions if you want to have multiple types of memoized versions based on different options.

```ts
const simple = memoize(fn); // { maxSize: 1 }
const upToFive = memoize(simple, { maxSize: 5 }); // { maxSize: 5 }
const withCustomEquals = memoize(upToFive, { isEqual: deepEqual }); // { maxSize: 5, isEqual: deepEqual }
```
NOTE: The original function is the function used in the composition; the composition only applies to the options. In the example above, `upToFive` does not call `simple`, it calls `fn`.
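As a rough sanity check (not from the original docs), the composed options and the preserved original function can be inspected through the `memoized.options` and `memoized.fn` properties described under Additional properties below:

```ts
import memoize from 'micro-memoize';
import { deepEqual } from 'fast-equals';

const fn = (one: string, two: string) => ({ one, two });

const simple = memoize(fn); // { maxSize: 1 }
const upToFive = memoize(simple, { maxSize: 5 }); // { maxSize: 5 }
const withCustomEquals = memoize(upToFive, { isEqual: deepEqual }); // { maxSize: 5, isEqual: deepEqual }

// The composed options are exposed on the wrapper (see `memoized.options` below).
console.log(withCustomEquals.options.maxSize); // expected: 5
console.log(withCustomEquals.options.isEqual === deepEqual); // expected: true

// The wrapper calls the original `fn`, not `simple` or `upToFive`
// (see `memoized.fn` below).
console.log(withCustomEquals.fn === fn); // expected: true
```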
Options
isEqual
`function(object1: any, object2: any): boolean`, defaults to `isSameValueZero`

Custom method to compare equality of keys, determining whether to pull from cache or not, by comparing each argument in order.
Common use-cases:
- Deep equality comparison
- Limiting the arguments compared
```ts
import { deepEqual } from 'fast-equals';

type ContrivedObject = {
  deep: string;
};

const deepObject = (object: {
  foo: ContrivedObject;
  bar: ContrivedObject;
}) => ({
  foo: object.foo,
  bar: object.bar,
});

const memoizedDeepObject = memoize(deepObject, { isEqual: deepEqual });

console.log(
  memoizedDeepObject({
    foo: {
      deep: 'foo',
    },
    bar: {
      deep: 'bar',
    },
    baz: {
      deep: 'baz',
    },
  }),
); // {foo: {deep: 'foo'}, bar: {deep: 'bar'}}

console.log(
  memoizedDeepObject({
    foo: {
      deep: 'foo',
    },
    bar: {
      deep: 'bar',
    },
    baz: {
      deep: 'baz',
    },
  }),
); // pulled from cache
```
NOTE: The default method tests for SameValueZero equality, which is summarized as strictly equal while also considering `NaN` equal to `NaN`.
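As a further illustration (a hypothetical comparator, not from the original docs), a per-argument case-insensitive comparison might look like:

```ts
import memoize from 'micro-memoize';

const greet = (greeting: string, name: string) => `${greeting}, ${name}!`;

// Compare each argument case-insensitively; non-strings fall back to a
// SameValueZero-style comparison (strict equality, with NaN equal to NaN).
const caseInsensitiveEqual = (a: any, b: any) =>
  typeof a === 'string' && typeof b === 'string'
    ? a.toLowerCase() === b.toLowerCase()
    : a === b || (a !== a && b !== b);

const memoized = memoize(greet, { isEqual: caseInsensitiveEqual });

console.log(memoized('Hello', 'World')); // 'Hello, World!'
console.log(memoized('HELLO', 'world')); // pulled from cache, 'Hello, World!'
```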
isMatchingKey
`function(object1: any[], object2: any[]): boolean`

Custom method to compare equality of keys, determining whether to pull from cache or not, by comparing the entire key.
Common use-cases:
- Comparing the shape of the key
- Matching on values regardless of order
- Serialization of arguments (see the sketch after the example below)
```ts
import { deepEqual } from 'fast-equals';

type ContrivedObject = { foo: string; bar: number };

const deepObject = (object: ContrivedObject) => ({
  foo: object.foo,
  bar: object.bar,
});

const memoizedShape = memoize(deepObject, {
  // receives the full key in cache and the full key of the most recent call
  isMatchingKey(key1, key2) {
    const object1 = key1[0];
    const object2 = key2[0];

    return (
      object1.hasOwnProperty('foo') &&
      object2.hasOwnProperty('foo') &&
      object1.bar === object2.bar
    );
  },
});

console.log(
  memoizedShape({
    foo: 'foo',
    bar: 123,
    baz: 'baz',
  }),
); // {foo: 'foo', bar: 123}

console.log(
  memoizedShape({
    foo: 'not foo',
    bar: 123,
    baz: 'baz',
  }),
); // pulled from cache
```
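A minimal sketch of the serialization use-case mentioned in the list above (function names are hypothetical), matching on the JSON-serialized form of the entire key:

```ts
import memoize from 'micro-memoize';

const buildQuery = (filters: { [key: string]: string }) => ({ ...filters });

const memoizedQuery = memoize(buildQuery, {
  // Two keys match when their serialized forms are identical (note that
  // property order matters for JSON.stringify, so this is only a sketch).
  isMatchingKey: (key1: any[], key2: any[]) =>
    JSON.stringify(key1) === JSON.stringify(key2),
});

console.log(memoizedQuery({ status: 'open', owner: 'me' })); // computed
console.log(memoizedQuery({ status: 'open', owner: 'me' })); // pulled from cache
```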
isPromise
`boolean`, defaults to `false`

Identifies the value returned from the method as a `Promise`, which will result in one of two possible scenarios:
- If the promise is resolved, it will fire the `onCacheHit` and `onCacheChange` options
- If the promise is rejected, it will trigger auto-removal from cache
```ts
const fn = async (one: string, two: string) => {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      reject(new Error(JSON.stringify({ one, two })));
    }, 500);
  });
};

const memoized = memoize(fn, { isPromise: true });

memoized('one', 'two');

console.log(memoized.cache.snapshot.keys); // [['one', 'two']]
console.log(memoized.cache.snapshot.values); // [Promise]

setTimeout(() => {
  console.log(memoized.cache.snapshot.keys); // []
  console.log(memoized.cache.snapshot.values); // []
}, 1000);
```
NOTE: If you don't want rejections to auto-remove the entry from cache, set `isPromise` to `false` (or simply do not set it), but be aware this will also remove the cache listeners that fire on successful resolution.
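For contrast with the rejection example above, here is a sketch of the resolution path, assuming the behavior described above where a resolved promise keeps its cache entry:

```ts
import memoize from 'micro-memoize';

const fn = async (one: string, two: string) => ({ one, two });

const memoized = memoize(fn, { isPromise: true });

memoized('one', 'two');

// The pending promise is cached immediately.
console.log(memoized.cache.snapshot.keys); // [['one', 'two']]

setTimeout(() => {
  // After successful resolution the entry remains in cache, so repeat calls
  // reuse the cached promise instead of re-invoking `fn`.
  console.log(memoized.cache.snapshot.keys); // [['one', 'two']]
  console.log(memoized('one', 'two') === memoized('one', 'two')); // expected: true
}, 100);
```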
maxSize
`number`, defaults to `1`

The number of values to store in cache, based on a Least Recently Used basis. This operates the same as `maxSize` on `moize`, with the exception of the default being different.
```ts
const manyPossibleArgs = (one: string, two: string) => [one, two];

const memoized = memoize(manyPossibleArgs, { maxSize: 3 });

console.log(memoized('one', 'two')); // ['one', 'two']
console.log(memoized('two', 'three')); // ['two', 'three']
console.log(memoized('three', 'four')); // ['three', 'four']

console.log(memoized('one', 'two')); // pulled from cache
console.log(memoized('two', 'three')); // pulled from cache
console.log(memoized('three', 'four')); // pulled from cache

console.log(memoized('four', 'five')); // ['four', 'five'], drops ['one', 'two'] from cache
```
NOTE: The default for `micro-memoize` differs from the default implementation of `moize`. `moize` will store an infinite number of results unless restricted, whereas `micro-memoize` will only store the most recent result. In this way, the default implementation of `micro-memoize` operates more like `moize.simple`.
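For comparison, a sketch of the default behavior with no `maxSize` specified, where only the most recent result is kept:

```ts
import memoize from 'micro-memoize';

const fn = (one: string, two: string) => [one, two];

const memoized = memoize(fn); // default maxSize of 1

memoized('one', 'two');
memoized('one', 'two'); // pulled from cache

memoized('three', 'four'); // replaces ['one', 'two'] as the only cached entry
memoized('one', 'two'); // recomputed

console.log(memoized.cache.snapshot.keys); // [['one', 'two']]
```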
onCacheAdd
`function(cache: Cache, options: Options): void`

Callback method that executes whenever the cache is added to. This is mainly to allow for higher-order caching managers that use `micro-memoize` to perform superset functionality on the `cache` object.
```ts
const fn = (one: string, two: string) => [one, two];

const memoized = memoize(fn, {
  maxSize: 2,
  onCacheAdd(cache, options) {
    console.log('cache has been added to: ', cache);
    console.log('memoized method has the following options applied: ', options);
  },
});

memoized('foo', 'bar'); // cache has been added to
memoized('foo', 'bar');
memoized('foo', 'bar');

memoized('bar', 'foo'); // cache has been added to
memoized('bar', 'foo');
memoized('bar', 'foo');

memoized('foo', 'bar');
memoized('foo', 'bar');
memoized('foo', 'bar');
```
NOTE: This method is not executed when the `cache` is manually manipulated, only when changed via calling the memoized method.
onCacheChange
`function(cache: Cache, options: Options): void`

Callback method that executes whenever the cache is added to or the order is updated. This is mainly to allow for higher-order caching managers that use `micro-memoize` to perform superset functionality on the `cache` object.
```ts
const fn = (one: string, two: string) => [one, two];

const memoized = memoize(fn, {
  maxSize: 2,
  onCacheChange(cache, options) {
    console.log('cache has changed: ', cache);
    console.log('memoized method has the following options applied: ', options);
  },
});

memoized('foo', 'bar'); // cache has changed
memoized('foo', 'bar');
memoized('foo', 'bar');

memoized('bar', 'foo'); // cache has changed
memoized('bar', 'foo');
memoized('bar', 'foo');

memoized('foo', 'bar'); // cache has changed
memoized('foo', 'bar');
memoized('foo', 'bar');
```
NOTE: This method is not executed when the `cache` is manually manipulated, only when changed via calling the memoized method. When the execution of other cache listeners (`onCacheAdd`, `onCacheHit`) is applicable, this method will execute after those methods.
onCacheHit
`function(cache: Cache, options: Options): void`

Callback method that executes whenever the cache is hit, whether the order is updated or not. This is mainly to allow for higher-order caching managers that use `micro-memoize` to perform superset functionality on the `cache` object.
```ts
const fn = (one: string, two: string) => [one, two];

const memoized = memoize(fn, {
  maxSize: 2,
  onCacheHit(cache, options) {
    console.log('cache was hit: ', cache);
    console.log('memoized method has the following options applied: ', options);
  },
});

memoized('foo', 'bar');
memoized('foo', 'bar'); // cache was hit
memoized('foo', 'bar'); // cache was hit

memoized('bar', 'foo');
memoized('bar', 'foo'); // cache was hit
memoized('bar', 'foo'); // cache was hit

memoized('foo', 'bar'); // cache was hit
memoized('foo', 'bar'); // cache was hit
memoized('foo', 'bar'); // cache was hit
```
NOTE: This method is not executed when the `cache` is manually manipulated, only when changed via calling the memoized method.
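As a sketch of the kind of superset functionality these listeners enable (hypothetical code, not part of the library), a small hit/addition counter can be layered on top of `onCacheAdd` and `onCacheHit`:

```ts
import memoize from 'micro-memoize';

// Hypothetical stats collector built from the cache listeners.
let hits = 0;
let additions = 0;

const fn = (one: string, two: string) => [one, two];

const memoized = memoize(fn, {
  maxSize: 2,
  onCacheAdd() {
    additions++;
  },
  onCacheHit() {
    hits++;
  },
});

memoized('foo', 'bar'); // addition
memoized('foo', 'bar'); // hit
memoized('baz', 'quz'); // addition
memoized('foo', 'bar'); // hit

console.log({ additions, hits }); // { additions: 2, hits: 2 }
```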
transformKey
`function(Array<any>): any`

A method that allows you to transform the key that is used for caching, if you want to use something other than the pure arguments.
```ts
const ignoreFunctionArgs = (one: string, two: () => {}) => [one, two];

const memoized = memoize(ignoreFunctionArgs, {
  transformKey: JSON.stringify,
});

console.log(memoized('one', () => {})); // ['one', () => {}]
console.log(memoized('one', () => {})); // pulled from cache, ['one', () => {}]
```
If your transformed keys require something other than SameValueZero equality, you can combine `transformKey` with `isMatchingKey` for completely custom key creation and comparison.
```ts
const ignoreFunctionArg = (one: string, two: () => void) => [one, two];

const memoized = memoize(ignoreFunctionArg, {
  isMatchingKey: (key1, key2) => key1[0] === key2[0],
  // Cache based on the serialized first parameter
  transformKey: (args) => [JSON.stringify(args[0])],
});

console.log(memoized('one', () => {})); // ['one', () => {}]
console.log(memoized('one', () => {})); // pulled from cache, ['one', () => {}]
```
Additional properties
memoized.cache
`Object`

The `cache` object that is used internally. The shape of this structure:
```ts
{
  keys: any[][], // available as MicroMemoize.Key[]
  values: any[] // available as MicroMemoize.Value[]
}
```
The exposure of this object is to allow for manual manipulation of keys/values (injection, removal, expiration, etc).
```ts
const method = (one: string, two: string) => ({ one, two });

const memoized = memoize(method);

memoized.cache.keys.push(['one', 'two']);
memoized.cache.values.push('cached');

console.log(memoized('one', 'two')); // 'cached'
```
NOTE: `moize` offers a variety of convenience methods for this manual `cache` manipulation, and while `micro-memoize` allows all the same capabilities by exposing the `cache`, it does not provide any convenience methods.
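For example, removing a specific entry by hand might look like the following sketch (hypothetical code; it relies on keys and values sharing the same index, per the cache shape above):

```ts
import memoize from 'micro-memoize';

const method = (one: string, two: string) => ({ one, two });

const memoized = memoize(method, { maxSize: 2 });

memoized('one', 'two');
memoized('three', 'four');

// Manually remove a specific entry; keys and values share the same index.
const index = memoized.cache.keys.findIndex(
  (key) => key[0] === 'one' && key[1] === 'two',
);

if (index !== -1) {
  memoized.cache.keys.splice(index, 1);
  memoized.cache.values.splice(index, 1);
}

console.log(memoized.cache.snapshot.keys); // [['three', 'four']]
```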
memoized.cache.snapshot
`Object`

This is identical to the `cache` object referenced above, but it is a deep clone created at request, which will provide a persistent snapshot of the values at that time. This is useful when tracking the cache changes over time, as the `cache` object is mutated internally for performance reasons.
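A small sketch illustrating that a snapshot is decoupled from later cache mutations:

```ts
import memoize from 'micro-memoize';

const fn = (one: string, two: string) => [one, two];

const memoized = memoize(fn, { maxSize: 2 });

memoized('one', 'two');

// Take a snapshot, then change the cache by calling with new arguments.
const before = memoized.cache.snapshot;

memoized('three', 'four');

console.log(before.keys); // [['one', 'two']], the snapshot is unchanged
console.log(memoized.cache.snapshot.keys); // [['three', 'four'], ['one', 'two']]
```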
memoized.fn
`function`

The original function passed to be memoized.
memoized.isMemoized
`boolean`

Hard-coded to `true` when the function is memoized. This is useful for introspection, to identify if a method has been memoized or not.
memoized.options
`Object`

The `options` passed when creating the memoized method.
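A brief introspection sketch tying these properties together (the `ensureMemoized` helper is hypothetical, not part of the library):

```ts
import memoize from 'micro-memoize';

const fn = (one: string, two: string) => ({ one, two });

const memoized = memoize(fn, { maxSize: 5 });

console.log(memoized.fn === fn); // true
console.log(memoized.isMemoized); // true
console.log(memoized.options.maxSize); // 5

// Hypothetical guard that only memoizes when needed.
const ensureMemoized = (candidate: any) =>
  candidate.isMemoized ? candidate : memoize(candidate);
```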
Benchmarks
All values provided are the number of operations per second (ops/sec) calculated by the Benchmark suite. Note that `underscore`, `lodash`, and `ramda` do not support multiple-parameter memoization (which is where `micro-memoize` really shines), so they are not included in those benchmarks.
Benchmarks were performed on an i7 8-core Arch Linux laptop with 16GB of memory using NodeJS version `10.15.0`. The default configuration of each library was tested with a fibonacci calculation based on the following parameters:
- Single primitive = `35`
- Single object = `{number: 35}`
- Multiple primitives = `35, true`
- Multiple objects = `{number: 35}, {isComplete: true}`
NOTE: Not all libraries tested support multiple parameters out of the box, but support the ability to pass a custom `resolver`. Because these often need to resolve to a string value, a common suggestion is to just `JSON.stringify` the arguments, so that is what is used when needed.
Single parameter (primitive only)
This is usually what benchmarks target for ... it's the least-likely use-case, but the easiest to optimize, often at the expense of more common use-cases.
| Library | Operations / second |
| --- | --- |
| fast-memoize | 59,069,204 |
| micro-memoize | 48,267,295 |
| lru-memoize | 46,781,143 |
| Addy Osmani | 32,372,414 |
| lodash | 29,297,916 |
| ramda | 25,054,838 |
| mem | 24,848,072 |
| underscore | 24,847,818 |
| memoizee | 18,272,987 |
| memoizerific | 7,302,835 |
Single parameter (complex object)
This is what most memoization libraries target as the primary use-case, as it removes the complexities of multiple arguments but allows for usage with one to many values.
| Library | Operations / second |
| --- | --- |
| micro-memoize | 40,360,621 |
| lodash | 30,862,028 |
| lru-memoize | 25,740,572 |
| memoizee | 12,058,375 |
| memoizerific | 6,854,855 |
| ramda | 2,287,030 |
| underscore | 2,270,574 |
| Addy Osmani | 2,076,031 |
| mem | 2,001,984 |
| fast-memoize | 1,591,019 |
Multiple parameters (primitives only)
This is a very common use-case for function calls, but can be more difficult to optimize because you need to account for multiple possibilities ... did the number of arguments change, are there default arguments, etc.
| Library | Operations / second |
| --- | --- |
| micro-memoize | 33,546,353 |
| lru-memoize | 20,884,669 |
| memoizee | 7,831,161 |
| Addy Osmani | 6,447,448 |
| memoizerific | 5,587,779 |
| mem | 2,620,943 |
| underscore | 1,617,687 |
| ramda | 1,569,167 |
| lodash | 1,512,515 |
| fast-memoize | 1,376,665 |
Multiple parameters (complex objects)
This is the most robust use-case, with the same complexities as multiple primitives but managing bulkier objects with additional edge scenarios (destructured with defaults, for example).
| Library | Operations / second |
| --- | --- |
| micro-memoize | 34,857,438 |
| lru-memoize | 20,838,330 |
| memoizee | 7,820,066 |
| memoizerific | 5,761,357 |
| mem | 1,184,550 |
| ramda | 1,034,937 |
| underscore | 1,021,480 |
| Addy Osmani | 1,014,642 |
| lodash | 1,014,060 |
| fast-memoize | 949,213 |
Browser support
- Chrome (all versions)
- Firefox (all versions)
- Edge (all versions)
- Opera 15+
- IE 9+
- Safari 6+
- iOS 8+
- Android 4+
Node support
- 4+
Development
Standard stuff: clone the repo and `npm install` dependencies. The npm scripts available:

- `build` => run webpack to build development `dist` file with `NODE_ENV=development`
- `build:minified` => run webpack to build production `dist` file with `NODE_ENV=production`
- `dev` => run webpack dev server to run example app (playground!)
- `dist` => runs `build` and `build:minified`
- `lint` => run ESLint against all files in the `src` folder
- `prepublish` => runs `compile-for-publish`
- `prepublish:compile` => run `lint`, `test`, `transpile:es`, `transpile:lib`, `dist`
- `test` => run AVA test functions with `NODE_ENV=test`
- `test:coverage` => run `test`, but with `nyc` for coverage checker
- `test:watch` => run `test`, but with persistent watcher
- `transpile:lib` => run babel against all files in `src` to create files in `lib`
- `transpile:es` => run babel against all files in `src` to create files in `es`, preserving ES2015 modules (for `pkg.module`)
No vulnerabilities found.
Reason: no binaries found in the repo

Reason: license file detected
Details:
- Info: project has a license file: LICENSE:0
- Info: FSF or OSI recognized license: MIT License: LICENSE:0

Reason: 3 commit(s) and 3 issue activity found in the last 90 days -- score normalized to 5

Reason: Found 1/18 approved changesets -- score normalized to 0

Reason: no effort to earn an OpenSSF best practices badge detected

Reason: security policy file not detected
Details:
- Warn: no security policy file detected
- Warn: no security file to analyze
- Warn: no security file to analyze
- Warn: no security file to analyze

Reason: project is not fuzzed
Details:
- Warn: no fuzzer integrations found

Reason: SAST tool is not run on all commits -- score normalized to 0
Details:
- Warn: 0 commits out of 18 are checked with a SAST tool

Reason: 30 existing vulnerabilities detected
Details:
- Warn: Project is vulnerable to: GHSA-qwcr-r2fm-qrc7
- Warn: Project is vulnerable to: GHSA-grv7-fg5c-xmjg
- Warn: Project is vulnerable to: GHSA-pxg6-pf52-xh8x
- Warn: Project is vulnerable to: GHSA-3xgq-45jj-v275
- Warn: Project is vulnerable to: GHSA-4gmj-3p3h-gm8h
- Warn: Project is vulnerable to: GHSA-rv95-896h-c2vc
- Warn: Project is vulnerable to: GHSA-qw6h-vgh9-j6wx
- Warn: Project is vulnerable to: GHSA-cxjh-pqwp-8mfp
- Warn: Project is vulnerable to: GHSA-c7qv-q95q-8v27
- Warn: Project is vulnerable to: GHSA-78xj-cgh5-2h22
- Warn: Project is vulnerable to: GHSA-2p57-rm9w-gvfp
- Warn: Project is vulnerable to: GHSA-jf85-cpcp-j695
- Warn: Project is vulnerable to: GHSA-fvqr-27wr-82fm
- Warn: Project is vulnerable to: GHSA-4xc9-xhrj-v574
- Warn: Project is vulnerable to: GHSA-x5rq-j2xg-h7qm
- Warn: Project is vulnerable to: GHSA-p6mc-m468-83gw
- Warn: Project is vulnerable to: GHSA-29mw-wpgm-hmr9
- Warn: Project is vulnerable to: GHSA-35jh-r3h4-6jhm
- Warn: Project is vulnerable to: GHSA-952p-6rrq-rcjv
- Warn: Project is vulnerable to: GHSA-9wv6-86v2-598j
- Warn: Project is vulnerable to: GHSA-rhx6-c78j-4q9w
- Warn: Project is vulnerable to: GHSA-gcx4-mw62-g8wm
- Warn: Project is vulnerable to: GHSA-c2qf-rxjj-qqgw
- Warn: Project is vulnerable to: GHSA-m6fv-jmcg-4jfg
- Warn: Project is vulnerable to: GHSA-cm22-4g7w-348p
- Warn: Project is vulnerable to: GHSA-cchq-frgv-rjh5
- Warn: Project is vulnerable to: GHSA-g644-9gfx-q4q4
- Warn: Project is vulnerable to: GHSA-4vvj-4cpr-p986
- Warn: Project is vulnerable to: GHSA-wr3j-pwj9-hqq6
- Warn: Project is vulnerable to: GHSA-3h5v-q93c-6h6q
Score: 2.8/10
Last Scanned on 2025-01-27
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.