Gathering detailed insights and metrics for kashe
npm install kashe
68 Stars · 48 Commits · 2 Forks · 3 Watching · 26 Branches · 2 Contributors
Updated on 06 Jul 2024
Languages: TypeScript (94.67%), JavaScript (5.33%)
Total Downloads

| Period | Downloads | Compared to previous period |
| --- | --- | --- |
| Last day | 879 | +25.2% |
| Last week | 4,348 | +20.3% |
| Last month | 17,121 | -6.8% |
| Last year | 216,039 | +3.6% |
A WeakMap-based memoization library for better and safer caching.
Memoization is a cool technique. But is it reliable and safe?
What is the difference between lodash.memoize, memoize-one, and React.useMemo?
useMemo is React; you cannot use it outside of a Functional Component. What about reselect, the tool powering the whole redux ecosystem? Still a single cache item.
Reselect 5.0 migrated to weakMapMemoize, which is effectively a kashe.
So, it's time to fix all the problems above. Want to know more? Read the article.
In short: to better REMEMBER something, you have to better FORGET it.
TLDR: kashe uses the passed arguments as keys in an internal WeakMap to store its results. It does not store anything permanently; the cache is always weak. Once the argument used as a key is gone, the cached data is gone too.
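As an illustration only (a simplified sketch, not kashe's actual implementation), the core idea can be shown with a plain WeakMap: the first argument keys the cache, so the cached entry lives exactly as long as that key object does:

```javascript
// Simplified sketch of WeakMap-based memoization (not the real kashe).
// The cache entry is reachable only through the key object, so once the
// key is garbage collected the cached result goes with it.
function weakMemo(fn) {
  const cache = new WeakMap();
  return (key, ...rest) => {
    const hit = cache.get(key);
    if (hit && hit.args.length === rest.length && hit.args.every((a, i) => a === rest[i])) {
      return hit.result;
    }
    const result = fn(key, ...rest);
    cache.set(key, { args: rest, result });
    return result;
  };
}

const selector = weakMemo(state => [state.a, state.b]);
const state = { a: 1, b: 2 };
console.log(selector(state) === selector(state)); // true: same key, cached result
```

Note this sketch, like memoize-one, keeps only one result per key; the real library layers caches per argument.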
kashe(function: T): T - transparent weak memoization. Requires the first argument to be an object, array, or function. The first argument will be used to store the result.

```js
import {kashe} from 'kashe';

const selector = state => [state.a, state.b];
const memoizedSelector = kashe(selector);
memoizedSelector(state) === memoizedSelector(state);

const complexSelector = (state, field) => ({ field: state[field]});
const memoizedComplexSelector = kashe(complexSelector);
memoizedComplexSelector(state, 'a') === memoizedComplexSelector(state, 'a');
```
kashe uses nested memoization, creating a cascade of caches for every weak-mappable argument found. This greatly increases cache size and hit rate.
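To see why a cascade helps, here is an illustrative sketch (not kashe's internals): with one WeakMap level per weak-mappable argument, results for different argument pairs coexist instead of evicting each other, which is exactly what a single-slot memoizer cannot do:

```javascript
// Sketch: a two-level WeakMap cache keyed by two object arguments.
// Unlike a single-slot memoizer, results for (state1, props) and
// (state2, props) are stored simultaneously.
function nestedMemo(fn) {
  const outer = new WeakMap();
  return (a, b) => {
    let inner = outer.get(a);
    if (!inner) {
      inner = new WeakMap();
      outer.set(a, inner);
    }
    if (!inner.has(b)) {
      inner.set(b, fn(a, b));
    }
    return inner.get(b);
  };
}

const pick = nestedMemo((state, props) => ({ value: state[props.key] }));
const state1 = { x: 1 };
const state2 = { x: 2 };
const props = { key: 'x' };
const r1 = pick(state1, props);
const r2 = pick(state2, props);
console.log(pick(state1, props) === r1); // true: still cached
console.log(pick(state2, props) === r2); // true: both entries coexist
```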
For cases like selectors and mappers, it is sometimes easier to use a non-strict cache.

```js
import {weakKashe} from 'kashe';

const weakMap = weakKashe((data, iterator, ...deps) => data.map(iterator));

const derived = weakMap(data, line => ({...line, somethingElse}), localVariable1);
```
In this case:
- the cache is stored in the `data` argument, as with kashe
- the iterator argument, although always a new function, would not destroy the cache

boxed(function(...args)=>T): (_, ...args)=>T
- "prefixes" a call to a function with a "weak-mappable" argument. All arguments must be equal to return a cached result.
Use boxed to make any function kashe-memoizable, by adding a leading argument. Literally "puts a function in a box".
boxed effectively creates a "fork" in the cache, letting you fine-control execution.

```js
import {boxed} from 'kashe';

const addTwo = (a, b) => a + b; // could not be "kashe"-memoized on its own
const bAddTwo = boxed(addTwo);
const cacheKey = {}; // any object

bAddTwo(cacheKey, 1, 2) === bAddTwo(cacheKey, 1, 2) === 3
bAddTwo(otherCacheKey, 1, 2) // -> a new call

bAddTwo(cacheKey, 10, 20) // -> a new call - arguments don't match
bAddTwo(cacheKey, 1, 2)   // -> a new call - the original result was replaced by 10+20
```
inboxed(function(...args)=>T): (_, ...args)=>T
- "nests" a call to a function within a "weak-mappable" argument.
Use inboxed to make any function kashe-memoizable, by adding a leading argument. The difference from boxed: inboxed "nests" all the caches below it. It creates a micro-universe, giving you fine control over caching behavior and purging caches. Think of a concurrent server environment.
inboxed will affect every other kashe call inside it.

```js
import {inboxed} from 'kashe';

const selector = (state) => ({state}); // could be "kashe"-memoized
const memoizedSelector = kashe(selector);

const bSelector = boxed(memoizedSelector);
const ibSelector = inboxed(memoizedSelector);
const cacheKey = {}; // any object

ibSelector(cacheKey, state) === ibSelector(cacheKey, state)
ibSelector(otherCacheKey, state) // a new call. Another key is used for the inbox, so another cache is used for memoizedSelector
ibSelector(cacheKey, otherState) // a new call
ibSelector(cacheKey, state) // cacheKey has a cache for `state`

// but!
bSelector(cacheKey, state) === bSelector(otherCacheKey, state)

// bSelector is not "sharing" its own result (the key is different), but the underlying
// `memoizedSelector` shares, and the `state` argument is the same.
```
- boxed could increase the probability of caching a value
- inboxed could decrease the probability of caching a value

inboxed scopes all the nested caches behind the first argument. If it changes, the cache changes. Yet again: the first argument is WHERE the cache is stored.
boxed just stores the result in the first argument. If the cache is not found there, it is still possible to discover it in a nested cache.
```js
const memoizedSelector = kashe(selector);

const inboxedSelector = inboxed(memoizedSelector);
const boxedSelector = boxed(memoizedSelector);

// state1 !== state2. The selectors would use different caches, memoizedSelector included
inboxedSelector(state1, data) !== inboxedSelector(state2, data)

// state1 !== state2. Memoization at the box level fails, but memoizedSelector returns the same value
boxedSelector(state1, data) === boxedSelector(state2, data)
```
inboxedSelector is more memory-safe, but CPU-intensive. It guarantees all selectors would be clean for a session (the first argument).
boxedSelector is useful, but keep in mind that everything here still holds only ONE result. It may be wiped from the nested selector, but still exist in the boxed one.

```js
memoizedSelector(data1);
boxedSelector(state, data1); // they are the same
boxedSelector(state, data2); // updating the cache for both selectors
memoizedSelector(data2);     // they are the same
memoizedSelector(data1);     // cache is updated
boxedSelector(state, data2); // !!!! the result is still stored in `state`
```
fork(function: T): T - creates a copy of a selector with an overridden internal cache.
fork has the same effect inboxed has, but without adding a leading argument. The first argument is still expected to be an object, array, or function.

```js
const selector = (state) => ({state});

const memoized = kashe(selector);
memoized(state) === memoized(state);

const forked = fork(memoized);
forked(state) !== memoized(state); // the fork uses its own cache
```
Size: 1.01 kB
Let's imagine a simple HOC:

```js
const hoc = WrappedComponent => <SomeStuff><WrappedComponent/></SomeStuff>;
```

You want to call this function 10 times and always get the same result:

```js
hoc(ComponentA);
hoc(ComponentA); // !!! a new call === a new result, a new component, so a remount! We don't need it.

const memoizedHoc = memoizeOne(hoc);

memoizedHoc(ComponentA);
memoizedHoc(ComponentA); // YES! It works as expected!
memoizedHoc(ComponentB); // BAM! The previous result got wiped
memoizedHoc(ComponentA); // A new result, and BAM! The previous result got wiped

const kasheHoc = kashe(hoc);

kasheHoc(ComponentA);
kasheHoc(ComponentA); // YES! It works as expected!
kasheHoc(ComponentB); // YES! It works as expected! The result is stored in the first argument.
kasheHoc(ComponentA); // YES! It works as expected! The result is still inside ComponentA
```
But what about concurrent execution, where scope may matter, and where you don't want to leave any traces?

```js
// first client
kasheHoc(ComponentA);
// second client
kasheHoc(ComponentA); // We got a cached result :(

// let's fix it and "prefix" the selector
// (using `boxed` for the memoized `kasheHoc` would nullify the effect)

const boxedKasheHoc = inboxed(kasheHoc);
// first client
boxedKasheHoc(client1Key, ComponentA);
// second client
boxedKasheHoc(client2Key, ComponentA); // another client key - another memoization!
boxedKasheHoc(client2Key, ComponentB); // another argument key - another memoization!
boxedKasheHoc(client2Key, ComponentA); // the result is cached
```
A Reselect-compatible API
TLDR: it just replaces the default memoization for reselect with createSelectorCreator(strongMemoize). strongMemoize is not a public API yet.
Reselect is a great library, but it has one limitation: it stores only one result. There are a few attempts to "fix" it.
Magically, kashe is fully compatible with the reselect API.

```js
import {createSelector} from 'kashe/reselect';

const getDataSlice = (state, props) => state[props.sliceId];
const dataSelector = createSelector(getDataSlice, slice => ({slice})); // let's make it harder

const slice1Value = dataSelector(state, { sliceId: 1 });
const slice2Value = dataSelector(state, { sliceId: 2 });
// the real `reselect` would replace the stored value with a new one

const unknownValue = dataSelector(state, { sliceId: 1 });
// the real `reselect` would return a new object here

// `kashe/reselect` would return `slice1Value`
```
Error: No weak-mappable object found to read a cache from.
If none of the selectors returned a "weak-mappable" value (an object, array, function, or symbol), kashe throws. This is intentional: it stores its cache inside such objects, and without them it cannot work. However, if you think it should work anyway, just give it that "cache":

```js
const cache = {};
const selector = createSelector(
  someSelector,
  () => cache, // <---- cache for a selector
  selectedData => {/*...*/}
);
```
kashe cannot replace memoize-one everywhere, as it requires at least one argument to be an object or array.
But if at least one argument is on that list, go for it.
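The reason for this requirement is a WeakMap constraint: only objects (and, in modern engines, non-registered symbols) can serve as WeakMap keys, so a purely primitive argument list leaves the cache nowhere to live. A quick self-contained check:

```javascript
// WeakMap keys must be weak-mappable; primitives are rejected outright.
const wm = new WeakMap();
const key = { id: 42 };
wm.set(key, 'ok'); // fine: an object key

let threw = false;
try {
  wm.set(42, 'nope'); // TypeError: Invalid value used as weak map key
} catch (e) {
  threw = true;
}
console.log(threw); // true
```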
You may use React.useRef/useState/Context to create and propagate a per-instance or per-tree variable you can use for kashe.

```js
const KasheContext = React.createContext();
// create a "value provider". useRef would give you an object you may use
const CacheKeyProvider = ({children}) => (
  <KasheContext.Provider value={useRef(null)}>{children}</KasheContext.Provider>
);

const memoizedFunction = kashe(aFunction);

const OtherComponent = () => {
  const kasheKey = useContext(KasheContext);
  const localKasheKey = useRef();
  // use the per-tree key to store data
  const memoizedData1 = memoizedFunction(kasheKey, firstArgument, secondArgument);
  // use the per-instance key to store data
  const memoizedData2 = memoizedFunction(localKasheKey, firstArgument, secondArgument);
}
```

So, almost the same as React.useMemo, but you may use it in Class Components and mapStateToProps.
See "Don't Stop the Data Flow in Rendering" for details about memoization in React.
```js
// wrap slowlyCalculateTextColor with a leading "state" argument
const generateTextColor = boxed(slowlyCalculateTextColor);

class MyComponent extends React.Component {
  // ...
  render() {
    const { color, children } = this.props;
    // use `this` as the "state" (cache key)
    const textColor = generateTextColor(this, color);
    return (
      <button className={'Button-' + color + ' Button-text-' + textColor}>
        {children}
      </button>
    );
  }
}
```
```js
const mapStateToProps = () => {
  const selector1 = fork(selectors.selector1);
  return state => ({
    value1: selector1(state),                        // "per-instance" selector
    value2: selectors.selector2(box, state),         // normal selector
    value3: memoizedFunction(selector1, state.data), // use "selector1" as a cache key for another function
  });
};
```
The nearest analog of kashe is weak-memoize, but it accepts only one argument.

```text
// a simple one-argument function
memoize-one one argument  x 58,277,071 ops/sec ±1.60% (87 runs sampled)
kashe one argument        x 19,724,367 ops/sec ±0.76% (91 runs sampled)

// a simple two-argument function
memoize-one two arguments x 42,526,871 ops/sec ±0.77% (90 runs sampled)
kashe two arguments       x 16,929,449 ops/sec ±0.84% (89 runs sampled)

// using more than one object to call - memoize-one fails, while kashe still works
// PS: multiply results by 2
memoize-one two states    x 308,917 ops/sec ±0.56% (92 runs sampled)
kashe two states          x 8,992,170 ops/sec ±0.96% (83 runs sampled)
```
When I first heard that my nickname, kashey, is pronounced like "cache", I decided to create a caching library one day. Here we go.
MIT
No vulnerabilities found.
OpenSSF Scorecard findings:
- no binaries found in the repo
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- Found 1/24 approved changesets -- score normalized to 0
- no effort to earn an OpenSSF best practices badge detected
- security policy file not detected
- license file not detected
- project is not fuzzed
- branch protection not enabled on development/release branches
- SAST tool is not run on all commits -- score normalized to 0
- 120 existing vulnerabilities detected

Last Scanned on 2024-11-18
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.