Gathering detailed insights and metrics for tiny-lru
npm install tiny-lru
BSD-3-Clause License
164 Stars
571 Commits
25 Forks
2 Watchers
2 Branches
12 Contributors
Updated on Jul 11, 2025
Latest Version: 11.3.3
Package Id: tiny-lru@11.3.3
Unpacked Size: 37.60 kB
Size: 7.90 kB
File Count: 6
NPM Version: 10.9.2
Node Version: 23.10.0
Published on: Jun 18, 2025
A lightweight, high-performance Least Recently Used (LRU) cache implementation for JavaScript with optional TTL (time-to-live) support. Works in both Node.js and browser environments.
```bash
npm install tiny-lru
```
Factory function:

```js
import {lru} from "tiny-lru";
const cache = lru(max, ttl = 0, resetTtl = false);
```
Class constructor:

```js
import {LRU} from "tiny-lru";

// Create a cache with 1000 items, 1 minute TTL, reset on access
const cache = new LRU(1000, 60000, true);

// Create a cache with TTL
const cache2 = new LRU(100, 5000); // 100 items, 5 second TTL
cache2.set('key1', 'value1');
// After 5 seconds, key1 will be expired
```
The LRU class can also be extended:

```js
import {LRU} from "tiny-lru";

class MyCache extends LRU {
  constructor(max, ttl, resetTtl) {
    super(max, ttl, resetTtl);
  }
}
```
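As a hedged illustration of why subclassing can be useful, the sketch below wraps the documented setWithEvicted() method (described later in this document); ObservableLRU and its put() helper are hypothetical names, not part of tiny-lru:

```js
import {LRU} from "tiny-lru";

// Hypothetical subclass that reports evictions through a callback
class ObservableLRU extends LRU {
  constructor(max, ttl, resetTtl, onEvict = () => {}) {
    super(max, ttl, resetTtl);
    this.onEvict = onEvict;
  }

  // put() is a made-up helper name, used here to avoid shadowing set()
  put(key, value) {
    const evicted = this.setWithEvicted(key, value); // documented tiny-lru API
    if (evicted !== null) {
      this.onEvict(evicted.key, evicted.value);
    }
    return this;
  }
}

const cache = new ObservableLRU(2, 0, false, (key, value) => console.log(`evicted ${key}`, value));
cache.put('a', 1).put('b', 2).put('c', 3); // expected to log: evicted a 1
```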
Parameters:

- max {Number} - Maximum number of items to store, 0 means unlimited (default: 1000)
- ttl {Number} - Time-to-live in milliseconds, 0 disables expiration (default: 0)
- resetTtl {Boolean} - Reset TTL on each set() operation (default: false)

The factory function validates parameters and throws a TypeError for invalid values:
```js
// Invalid parameters will throw TypeError
try {
  const cache = lru(-1); // Invalid max value
} catch (error) {
  console.error(error.message); // "Invalid max value"
}

try {
  const cache = lru(100, -1); // Invalid ttl value
} catch (error) {
  console.error(error.message); // "Invalid ttl value"
}

try {
  const cache = lru(100, 0, "true"); // Invalid resetTtl value
} catch (error) {
  console.error(error.message); // "Invalid resetTtl value"
}
```
Compatible with Lodash's memoize function cache interface:
```js
import _ from "lodash";
import {lru} from "tiny-lru";

_.memoize.Cache = lru().constructor;
const memoized = _.memoize(myFunc);
memoized.cache.max = 10;
```
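A hedged sketch of the memoized cache in use; slowSquare is a made-up placeholder function, not part of tiny-lru or lodash:

```js
import _ from "lodash";
import {lru} from "tiny-lru";

_.memoize.Cache = lru().constructor;   // use tiny-lru as lodash's memoize cache

// Hypothetical "expensive" function used only for illustration
const slowSquare = n => n * n;

const fastSquare = _.memoize(slowSquare);
fastSquare.cache.max = 10;             // keep at most 10 memoized results

fastSquare(4);                         // computed and stored
fastSquare(4);                         // served from the LRU cache
```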
Tiny-LRU maintains 100% code coverage:
```
--------------|---------|----------|---------|---------|-------------------
File          | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
--------------|---------|----------|---------|---------|-------------------
All files     |     100 |    96.34 |     100 |     100 |
 tiny-lru.cjs |     100 |    96.34 |     100 |     100 | 190,225,245
--------------|---------|----------|---------|---------|-------------------
```
Tiny-LRU includes a comprehensive benchmark suite for performance analysis and comparison. The benchmark suite uses modern Node.js best practices and popular benchmarking tools.
The suite contains two scripts:

- modern-benchmark.js ⭐: comprehensive benchmark suite using Tinybench, organized into feature and test categories.
- performance-observer-benchmark.js: native Node.js performance measurement using the Performance Observer API and performance.timerify().
```bash
# Run all modern benchmarks
npm run benchmark:all

# Run individual benchmark suites
npm run benchmark:modern  # Tinybench suite
npm run benchmark:perf    # Performance Observer suite

# Or run directly
node benchmarks/modern-benchmark.js
node benchmarks/performance-observer-benchmark.js

# Run with garbage collection exposed (for memory analysis)
node --expose-gc benchmarks/modern-benchmark.js
```
Sample Tinybench output:

```
┌─────────┬──────────────────────────────┬─────────────┬───────────────────┬──────────┬─────────┐
│ (index) │ Task Name                    │ ops/sec     │ Average Time (ns) │ Margin   │ Samples │
├─────────┼──────────────────────────────┼─────────────┼───────────────────┼──────────┼─────────┤
│ 0       │ 'set-random-empty-cache-100' │ '2,486,234' │ 402.21854775934   │ '±0.45%' │ 1243117 │
└─────────┴──────────────────────────────┴─────────────┴───────────────────┴──────────┴─────────┘
```
Sample Performance Observer output:

```
┌──────────┬───────┬──────────┬──────────┬──────────┬─────────────┬─────────┬─────────┐
│ Function │ Calls │ Avg (ms) │ Min (ms) │ Max (ms) │ Median (ms) │ Std Dev │ Ops/sec │
├──────────┼───────┼──────────┼──────────┼──────────┼─────────────┼─────────┼─────────┤
│ lru.set  │ 1000  │ 0.0024   │ 0.0010   │ 0.0156   │ 0.0020      │ 0.0012  │ 417292  │
└──────────┴───────┴──────────┴──────────┴──────────┴─────────────┴─────────┴─────────┘
```
For accurate benchmark results, run the memory tests with Node's --expose-gc flag.

See benchmarks/README.md for complete documentation and advanced usage.
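As a minimal sketch of why --expose-gc matters for memory analysis (illustrative only, not part of the repository's benchmark suite; the file name in the comment is hypothetical):

```js
// Run with: node --expose-gc measure-cache-memory.mjs (hypothetical file name)
import {lru} from "tiny-lru";

if (global.gc) global.gc();                       // settle the heap before measuring
const before = process.memoryUsage().heapUsed;

const cache = lru(10000);
for (let i = 0; i < 10000; i++) {
  cache.set(`key${i}`, {i});
}

if (global.gc) global.gc();                       // collect temporary garbage, keep live cache entries
const after = process.memoryUsage().heapUsed;
console.log(`~${((after - before) / 1024).toFixed(1)} kB retained by 10k cached items`);
```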
first {Object|null} - Item in first (least recently used) position

```js
const cache = lru();
cache.first; // null - empty cache
```
last {Object|null} - Item in last (most recently used) position

```js
const cache = lru();
cache.last; // null - empty cache
```
max {Number} - Maximum number of items to hold in cache

```js
const cache = lru(500);
cache.max; // 500
```
resetTtl {Boolean} - Whether to reset TTL on each set() operation

```js
const cache = lru(500, 5*6e4, true);
cache.resetTtl; // true
```
size {Number} - Current number of items in cache

```js
const cache = lru();
cache.size; // 0 - empty cache
```
ttl {Number} - TTL in milliseconds (0 = no expiration)

```js
const cache = lru(100, 3e4);
cache.ttl; // 30000
```
clear()

Removes all items from cache.

Returns: {Object} LRU instance

```js
cache.clear();
```
delete(key)

Removes specified item from cache.

Parameters:

- key {String} - Item key

Returns: {Object} LRU instance

```js
cache.set('key1', 'value1');
cache.delete('key1');
console.log(cache.has('key1')); // false
```
entries([keys])

Returns array of cache items as [key, value] pairs.

Parameters:

- keys {Array} - Optional array of specific keys to retrieve (defaults to all keys)

Returns: {Array} Array of [key, value] pairs

```js
cache.set('a', 1).set('b', 2);
console.log(cache.entries()); // [['a', 1], ['b', 2]]
console.log(cache.entries(['a'])); // [['a', 1]]
```
evict()

Removes the least recently used item from cache.

Returns: {Object} LRU instance

```js
cache.set('old', 'value').set('new', 'value');
cache.evict(); // Removes 'old' item
```

See also: setWithEvicted()
expiresAt(key)

Gets expiration timestamp for cached item.

Parameters:

- key {String} - Item key

Returns: {Number|undefined} Expiration time (epoch milliseconds) or undefined if key doesn't exist

```js
const cache = new LRU(100, 5000); // 5 second TTL
cache.set('key1', 'value1');
console.log(cache.expiresAt('key1')); // timestamp 5 seconds from now
```
get(key)

Retrieves cached item and promotes it to most recently used position.

Parameters:

- key {String} - Item key

Returns: {*} Item value or undefined if not found/expired

```js
cache.set('key1', 'value1');
console.log(cache.get('key1')); // 'value1'
console.log(cache.get('nonexistent')); // undefined
```
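A short hedged sketch of how this promotion affects eviction order; the expected values follow from the max/eviction behavior documented here, not from captured output:

```js
import {lru} from "tiny-lru";

const cache2 = lru(2);          // capacity of 2 items
cache2.set('a', 1).set('b', 2);
cache2.get('a');                // promotes 'a' to most recently used
cache2.set('c', 3);             // cache is full, so the least recently used item ('b') is evicted
console.log(cache2.has('a'));   // expected: true
console.log(cache2.has('b'));   // expected: false
```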
has(key)

Checks if key exists in cache (without promoting it).

Parameters:

- key {String} - Item key

Returns: {Boolean} True if key exists and is not expired

```js
cache.set('key1', 'value1');
console.log(cache.has('key1')); // true
console.log(cache.has('nonexistent')); // false
```
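A hedged counterpart showing that has() leaves the LRU order untouched (expected output inferred from the documented behavior):

```js
import {lru} from "tiny-lru";

const cache3 = lru(2);
cache3.set('a', 1).set('b', 2);
cache3.has('a');                // existence check only; 'a' remains least recently used
cache3.set('c', 3);             // evicts 'a', not 'b'
console.log(cache3.keys());     // expected: ['b', 'c']
```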
keys()

Returns array of all cache keys in LRU order (first = least recent).

Returns: {Array} Array of keys

```js
cache.set('a', 1).set('b', 2);
cache.get('a'); // Move 'a' to most recent
console.log(cache.keys()); // ['b', 'a']
```
set(key, value)

Stores item in cache as most recently used.

Parameters:

- key {String} - Item key
- value {*} - Item value

Returns: {Object} LRU instance

```js
cache.set('key1', 'value1')
  .set('key2', 'value2')
  .set('key3', 'value3');
```

See also: get(), setWithEvicted()
setWithEvicted(key, value)

Stores item and returns evicted item if cache was full.

Parameters:

- key {String} - Item key
- value {*} - Item value

Returns: {Object|null} Evicted item {key, value, expiry, prev, next} or null

```js
const cache = new LRU(2);
cache.set('a', 1).set('b', 2);
const evicted = cache.setWithEvicted('c', 3); // evicted = {key: 'a', value: 1, ...}
if (evicted) {
  console.log(`Evicted: ${evicted.key}`, evicted.value);
}
```
values([keys])

Returns array of cache values.

Parameters:

- keys {Array} - Optional array of specific keys to retrieve (defaults to all keys)

Returns: {Array} Array of values

```js
cache.set('a', 1).set('b', 2);
console.log(cache.values()); // [1, 2]
console.log(cache.values(['a'])); // [1]
```
Examples:

```js
import {lru} from "tiny-lru";

// Create a cache with max 100 items
const cache = lru(100);
cache.set('key1', 'value1');
console.log(cache.get('key1')); // 'value1'

// Method chaining
cache.set("user:123", {name: "John", age: 30})
  .set("session:abc", {token: "xyz", expires: Date.now()});

const user = cache.get("user:123"); // Promotes to most recent
console.log(cache.size); // 2
```
```js
import {LRU} from "tiny-lru";

const cache = new LRU(50, 5000); // 50 items, 5s TTL
cache.set("temp-data", {result: "computed"});

setTimeout(() => {
  console.log(cache.get("temp-data")); // undefined - expired
}, 6000);
```
```js
const cache = lru(100, 10000, true); // Reset TTL on each set()
cache.set("session", {user: "admin"});
// Each subsequent set() resets the 10s TTL
```
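A hedged sketch of how the difference can be observed via expiresAt(); the expectations in the comments are inferred from the resetTtl description above rather than taken from the upstream docs:

```js
import {lru} from "tiny-lru";

// resetTtl = false (default): updating a key is expected to keep its original expiry
const fixed = lru(10, 10000);
fixed.set('k', 1);
const firstExpiry = fixed.expiresAt('k');
fixed.set('k', 2);                                  // value updated, TTL not reset
console.log(fixed.expiresAt('k') === firstExpiry);  // expected: true

// resetTtl = true: each set() pushes the expiry out again
const sliding = lru(10, 10000, true);
sliding.set('k', 1);
const before = sliding.expiresAt('k');
setTimeout(() => {
  sliding.set('k', 2);                              // expiry recalculated from "now"
  console.log(sliding.expiresAt('k') > before);     // expected: true
}, 50);
```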
Copyright (c) 2025 Jason Mulligan
Licensed under the BSD-3 license.
No vulnerabilities found.
OpenSSF Scorecard checks:

- no dangerous workflow patterns detected
- 29 commit(s) and 2 issue activity found in the last 90 days -- score normalized to 10
- no binaries found in the repo
- license file detected
- 0 existing vulnerabilities detected
- branch protection is not maximal on development and all release branches
- Found 0/7 approved changesets -- score normalized to 0
- detected GitHub workflow tokens with excessive permissions
- dependency not pinned by hash detected -- score normalized to 0
- no effort to earn an OpenSSF best practices badge detected
- security policy file not detected
- project is not fuzzed
- SAST tool is not run on all commits -- score normalized to 0
Last Scanned on 2025-07-07
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.