Related packages:

- `@types/lowdb`: Stub TypeScript definitions entry for lowdb, which provides its own type definitions
- `lodash-id`: Use JavaScript objects as databases
- `@commonify/lowdb`: CommonJS version of lowdb 3.0.0. See https://github.com/mifi/commonify
- `@seydx/lowdb`: Tiny local JSON database for Node, Electron and the browser
Package health scores:

- Supply Chain: 99.2
- Quality: 100
- Maintenance: 76.2
- Vulnerability: 100
- License: 100
Languages: JavaScript (99.12%), Shell (0.88%)
Total Downloads: 151,242,017
MIT License · 22,050 Stars · 545 Commits · 951 Forks · 243 Watchers · 2 Branches · 36 Contributors · Updated on May 15, 2025
- Latest Version: 7.0.1
- Package Id: lowdb@7.0.1
- Unpacked Size: 22.34 kB
- Size: 7.10 kB
- File Count: 29
- NPM Version: 10.2.3
- Node Version: 20.10.0
- Published on: Dec 26, 2023
Cumulative downloads, compared to the previous period:

| Period | Downloads | Change |
| --- | --- | --- |
| Last Day | 167,299 | -2% |
| Last Week | 987,777 | +2.5% |
| Last Month | 4,184,574 | +2.3% |
| Last Year | 39,883,098 | +29.3% |
Simple to use type-safe local JSON database 🦉
Read or create `db.json`:

```js
const db = await JSONFilePreset('db.json', { posts: [] })
```
Update data using `Array.prototype.*` methods, and lowdb automatically writes to `db.json`:

```js
const post = { id: 1, title: 'lowdb is awesome', views: 100 }
await db.update(({ posts }) => posts.push(post))
```

```js
// db.json
{
  "posts": [
    { "id": 1, "title": "lowdb is awesome", "views": 100 }
  ]
}
```
In the same spirit, query using native `Array.prototype.*`:

```js
const { posts } = db.data
const first = posts.at(0)
const results = posts.filter((post) => post.title.includes('lowdb'))
const post1 = posts.find((post) => post.id === 1)
const sortedPosts = posts.toSorted((a, b) => a.views - b.views)
```
It's that simple.
Become a sponsor and have your company logo here 👉 GitHub Sponsors
```sh
npm install lowdb
```
Lowdb is a pure ESM package. If you're having trouble using it in your project, please read this.
```js
import { JSONFilePreset } from 'lowdb/node'

// Read or create db.json
const defaultData = { posts: [] }
const db = await JSONFilePreset('db.json', defaultData)

// Update db.json
await db.update(({ posts }) => posts.push('hello world'))

// Alternatively you can call db.write() explicitly later
// to write to db.json
db.data.posts.push('hello world')
await db.write()
```

```js
// db.json
{
  "posts": [ "hello world" ]
}
```
You can use TypeScript to check your data types.

```ts
type Data = {
  messages: string[]
}

const defaultData: Data = { messages: [] }
const db = await JSONFilePreset<Data>('db.json', defaultData)

db.data.messages.push('foo') // ✅ Success
db.data.messages.push(1) // ❌ TypeScript error
```
You can extend lowdb with Lodash (or other libraries). To be able to extend it, we're not using `JSONFilePreset` here. Instead, we're using lower-level components.

```ts
import { Low } from 'lowdb'
import { JSONFile } from 'lowdb/node'
import lodash from 'lodash'

type Post = {
  id: number
  title: string
}

type Data = {
  posts: Post[]
}

// Extend Low class with a new `chain` field
class LowWithLodash<T> extends Low<T> {
  chain: lodash.ExpChain<this['data']> = lodash.chain(this).get('data')
}

const defaultData: Data = {
  posts: [],
}
const adapter = new JSONFile<Data>('db.json')

const db = new LowWithLodash(adapter, defaultData)
await db.read()

// Instead of db.data use db.chain to access lodash API
const post = db.chain
  .get('posts')
  .find({ id: 1 })
  .value() // Important: value() must be called to execute chain
```
See the src/examples/ directory.
Lowdb provides four presets for common cases.
- `JSONFilePreset(filename, defaultData)`
- `JSONFileSyncPreset(filename, defaultData)`
- `LocalStoragePreset(name, defaultData)`
- `SessionStoragePreset(name, defaultData)`
See the src/examples/ directory for usage.
Lowdb is extremely flexible. If you need to extend it or modify its behavior, use the classes and adapters below instead of the presets.
Lowdb has two classes (for asynchronous and synchronous adapters).
`new Low(adapter, defaultData)`

```js
import { Low } from 'lowdb'
import { JSONFile } from 'lowdb/node'

const db = new Low(new JSONFile('file.json'), {})
await db.read()
await db.write()
```
`new LowSync(adapterSync, defaultData)`

```js
import { LowSync } from 'lowdb'
import { JSONFileSync } from 'lowdb/node'

const db = new LowSync(new JSONFileSync('file.json'), {})
db.read()
db.write()
```
`db.read()`

Calls `adapter.read()` and sets `db.data`.

Note: `JSONFile` and `JSONFileSync` adapters will set `db.data` to `null` if the file doesn't exist.

```js
db.data // === null
db.read()
db.data // !== null
```
`db.write()`

Calls `adapter.write(db.data)`.

```js
db.data = { posts: [] }
db.write() // file.json will be { posts: [] }
db.data = {}
db.write() // file.json will be {}
```
`db.update(fn)`

Calls `fn()` then `db.write()`.

```js
db.update((data) => {
  // make changes to data
  // ...
})
// file.json will be updated
```
`db.data`

Holds your db content. If you're using the adapters coming with lowdb, it can be any type supported by `JSON.stringify`.

For example:

```js
db.data = 'string'
db.data = [1, 2, 3]
db.data = { key: 'value' }
```
`JSONFile` / `JSONFileSync`

Adapters for reading and writing JSON files.

```js
import { JSONFile, JSONFileSync } from 'lowdb/node'

new Low(new JSONFile(filename), {})
new LowSync(new JSONFileSync(filename), {})
```
`Memory` / `MemorySync`

In-memory adapters. Useful for speeding up unit tests. See the src/examples/ directory.

```js
import { Memory, MemorySync } from 'lowdb'

new Low(new Memory(), {})
new LowSync(new MemorySync(), {})
```
`LocalStorage` / `SessionStorage`

Synchronous adapters for `window.localStorage` and `window.sessionStorage`.

```js
import { LocalStorage, SessionStorage } from 'lowdb/browser'

new LowSync(new LocalStorage(name), {})
new LowSync(new SessionStorage(name), {})
```
`TextFile` / `TextFileSync`

Adapters for reading and writing text. Useful for creating custom adapters.
`DataFile` / `DataFileSync`

Adapters for easily supporting other data formats or adding behaviors (encrypt, compress, ...).

```js
import { DataFile } from 'lowdb'

new DataFile(filename, {
  parse: YAML.parse,
  stringify: YAML.stringify
})

new DataFile(filename, {
  parse: (str) => JSON.parse(decrypt(str)),
  stringify: (data) => encrypt(JSON.stringify(data))
})
```
If you've published an adapter for lowdb, feel free to create a PR to add it here.
You may want to create an adapter to write `db.data` to YAML or XML, to encrypt data, to use remote storage, ...

An adapter is a simple class that just needs to expose two methods:
```js
class AsyncAdapter {
  read() {
    /* ... */
  } // should return Promise<data>
  write(data) {
    /* ... */
  } // should return Promise<void>
}

class SyncAdapter {
  read() {
    /* ... */
  } // should return data
  write(data) {
    /* ... */
  } // should return nothing
}
```
For example, let's say you have some async storage and want to create an adapter for it:

```js
import { api } from './AsyncStorage'

class CustomAsyncAdapter {
  // Optional: your adapter can take arguments
  constructor(args) {
    // ...
  }

  async read() {
    const data = await api.read()
    return data
  }

  async write(data) {
    await api.write(data)
  }
}

const adapter = new CustomAsyncAdapter()
const db = new Low(adapter)
```
See src/adapters/ for more examples.
To create an adapter for another format than JSON, you can use `TextFile` or `TextFileSync`.

For example:

```js
import { Low } from 'lowdb'
import { TextFile } from 'lowdb/node'
import YAML from 'yaml'

class YAMLFile {
  constructor(filename) {
    this.adapter = new TextFile(filename)
  }

  async read() {
    const data = await this.adapter.read()
    if (data === null) {
      return null
    } else {
      return YAML.parse(data)
    }
  }

  write(obj) {
    return this.adapter.write(YAML.stringify(obj))
  }
}

const adapter = new YAMLFile('file.yaml')
const db = new Low(adapter)
```
Lowdb doesn't support Node's cluster module.

If you have large JavaScript objects (`~10-100MB`), you may hit some performance issues. This is because whenever you call `db.write`, the whole `db.data` is serialized using `JSON.stringify` and written to storage.
Depending on your use case, this can be fine or not. It can be mitigated by doing batch operations and calling `db.write` only when you need it.
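The batching idea can be sketched with a stub that has the same `data`/`write` surface as a lowdb instance, plus a write counter (the stub stands in for a real database, whose `update` would serialize on every call):

```javascript
// Stub with a lowdb-like shape; write() stands in for
// JSON.stringify-ing the whole db.data and hitting the disk.
let writes = 0
const db = {
  data: { posts: [] },
  async write() { writes++ }
}

// Batched: mutate db.data in memory a thousand times...
for (let i = 0; i < 1000; i++) {
  db.data.posts.push({ id: i })
}
// ...and flush once when the batch is done.
await db.write()
```

Here `writes` ends at 1 instead of 1000; with large objects that is the difference between one full serialization and a thousand.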
If you plan to scale, it's highly recommended to use databases like PostgreSQL or MongoDB instead.
No vulnerabilities found.

OpenSSF Scorecard checks:

- no binaries found in the repo
- no dangerous workflow patterns detected
- license file detected
- 4 existing vulnerabilities detected
- dependency not pinned by hash detected -- score normalized to 3
- Found 3/30 approved changesets -- score normalized to 1
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- detected GitHub workflow tokens with excessive permissions
- no effort to earn an OpenSSF best practices badge detected
- project is not fuzzed
- security policy file not detected
- branch protection not enabled on development/release branches
- SAST tool is not run on all commits -- score normalized to 0

Last Scanned on 2025-05-05
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.