Gathering detailed insights and metrics for dexie-export-import
11,841 stars · 2,510 commits · 641 forks · 110 watching · 151 branches · 92 contributors

Latest version: dexie-export-import@4.1.4, published 15 Nov 2024 (npm 10.8.2, Node 20.16.0)
Package size: 229.66 kB (1.01 MB unpacked, 16 files)
Export / Import IndexedDB <---> Blob
This module extends the 'dexie' module with new methods for importing and exporting databases to and from Blobs.
npm install dexie
npm install dexie-export-import
Here's the basic usage. There's a lot you can do by supplying optional `options` arguments. The available options are described later in this README (see the Typescript interfaces below).

NOTE: Typescript users on dexie@2.x will get compilation errors when using the static import method Dexie.import().
```js
import Dexie from "dexie";
import "dexie-export-import";

//
// Import from Blob or File to Dexie instance:
//
const db = await Dexie.import(blob, [options]);

//
// Export to Blob
//
const blob = await db.export([options]);

//
// Import from Blob or File to existing Dexie instance
//
await db.import(blob, [options]);
```
Here's a working sample on CodePen. It uses downloadjs to deliver the blob as a "file download" to the user. For receiving an import file, it uses a drop area where you can drop your JSON file. Click the Console tab at the bottom to see what the progressCallbacks receive.

Even though this sample doesn't show it, blobs can also be sent to or retrieved from a server using the fetch API.
| Product | Supported versions        |
|---------|---------------------------|
| dexie   | ^2.0.4 or ^3.0.0-alpha.5  |
| Safari  | ^10.1                     |
| IE      | 11                        |
| Edge    | any version               |
| Chrome  | any version               |
| FF      | any version               |
indexeddb-export-import is much smaller, but also much simpler and more limited than dexie-export-import, and can be a better choice for basic use cases. Dexie-export-import was built to scale when exporting large databases without consuming much RAM. It also supports importing/exporting exotic types.
Importing this module extends Dexie and Dexie.prototype as follows. Even though this is conceptually a Dexie.js addon, there is no addon instance; the extended interface is added onto Dexie and Dexie.prototype as a side effect of importing the module.
```ts
//
// Extend Dexie interface (typescript-wise)
//
declare module 'dexie' {
  // Extend methods on db
  interface Dexie {
    export(options?: ExportOptions): Promise<Blob>;
    import(blob: Blob, options?: ImportOptions): Promise<void>;
  }
  interface DexieConstructor {
    import(blob: Blob, options?: StaticImportOptions): Promise<Dexie>;
  }
}
```
These are the interfaces of the optional `options` arguments to Dexie.import() and Dexie.prototype.import(). All options are optional and default to undefined (falsy).
```ts
export interface StaticImportOptions {
  noTransaction?: boolean;
  chunkSizeBytes?: number; // Default: DEFAULT_KILOBYTES_PER_CHUNK ( 1MB )
  filter?: (table: string, value: any, key?: any) => boolean;
  progressCallback?: (progress: ImportProgress) => boolean;
  name?: string;
}

export interface ImportOptions extends StaticImportOptions {
  acceptMissingTables?: boolean;
  acceptVersionDiff?: boolean;
  acceptNameDiff?: boolean;
  acceptChangedPrimaryKey?: boolean;
  overwriteValues?: boolean;
  clearTablesBeforeImport?: boolean;
  noTransaction?: boolean;
  chunkSizeBytes?: number; // Default: DEFAULT_KILOBYTES_PER_CHUNK ( 1MB )
  filter?: (table: string, value: any, key?: any) => boolean;
  progressCallback?: (progress: ImportProgress) => boolean;
}
```
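For illustration, a small sketch of an import that uses a filter and a progressCallback. The table name "friends" is just an assumed example, and the callback returning true is an assumption about continuing the operation:

```javascript
// Filter receives (table, value, key) for every row; returning false skips the row.
// "friends" is a hypothetical table name for this example.
const onlyFriends = (table, value, key) => table === "friends";

// The progressCallback signature expects a boolean return value;
// returning true is assumed here to mean "continue".
const logProgress = ({ completedRows, totalRows, done }) => {
  console.log(`Imported ${completedRows}/${totalRows} rows${done ? " (done)" : ""}`);
  return true;
};

// Usage in a browser with dexie and dexie-export-import loaded:
// await db.import(blob, {
//   clearTablesBeforeImport: true,
//   filter: onlyFriends,
//   progressCallback: logProgress,
// });
```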
This is the interface sent to the progressCallback.
```ts
export interface ImportProgress {
  totalTables: number;
  completedTables: number;
  totalRows: number;
  completedRows: number;
  done: boolean;
}
```
This is the interface of the optional `options` argument to Dexie.prototype.export(). All options are optional and default to undefined (falsy).
```ts
export interface ExportOptions {
  noTransaction?: boolean;
  numRowsPerChunk?: number;
  prettyJson?: boolean;
  filter?: (table: string, value: any, key?: any) => boolean;
  progressCallback?: (progress: ExportProgress) => boolean;
}
```
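For illustration, a sketch of an export call with a progress callback. percentDone is a hypothetical helper, not part of the addon's API:

```javascript
// Hypothetical helper: compute percent complete from an ExportProgress object.
function percentDone({ completedRows, totalRows }) {
  return totalRows ? Math.round((100 * completedRows) / totalRows) : 0;
}

// Usage in a browser:
// const blob = await db.export({
//   prettyJson: true,
//   progressCallback: p => { console.log(`${percentDone(p)}%`); return true; },
// });
```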
This is the interface sent to the ExportOptions.progressCallback.
```ts
export interface ExportProgress {
  totalTables: number;
  completedTables: number;
  totalRows: number;
  completedRows: number;
  done: boolean;
}
```
These are the default chunk sizes used when not specified in the options object. We allow quite large chunks, but still not that large (1 MB of RAM is not much, even for a small device).
```js
const DEFAULT_KILOBYTES_PER_CHUNK = 1024; // When importing blob
const DEFAULT_ROWS_PER_CHUNK = 2000;      // When exporting db
```
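As a back-of-envelope check, these defaults translate into chunk counts as follows (pure arithmetic, independent of dexie itself):

```javascript
const KILOBYTES_PER_CHUNK = 1024; // import: 1 MB per chunk
const ROWS_PER_CHUNK = 2000;      // export: 2000 rows per chunk

// How many chunks a blob of a given byte size is read in during import:
const importChunks = bytes => Math.ceil(bytes / (KILOBYTES_PER_CHUNK * 1024));
// How many chunks a table of a given row count is written in during export:
const exportChunks = rows => Math.ceil(rows / ROWS_PER_CHUNK);

console.log(importChunks(10 * 1024 * 1024)); // a 10 MB blob → 10 chunks
console.log(exportChunks(5000));             // 5000 rows → 3 chunks
```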
The JSON format is described in the Typescript interface below. This JSON format is streamable: it is generated in a streaming fashion, and it is also imported in a streaming fashion. Therefore, it is important that the data comes last in the file.
```ts
export interface DexieExportJsonStructure {
  formatName: 'dexie';
  formatVersion: 1;
  data: {
    databaseName: string;
    databaseVersion: number;
    tables: Array<{
      name: string;
      schema: string; // '++id,name,age'
      rowCount: number;
    }>;
    data: Array<{ // This property must be last (for streaming purpose)
      tableName: string;
      inbound: boolean;
      rows: any[]; // This property must be last (for streaming purpose)
    }>;
  }
}
```
```json
{
  "formatName": "dexie",
  "formatVersion": 1,
  "data": {
    "databaseName": "dexie-export-import-basic-tests",
    "databaseVersion": 1,
    "tables": [
      {
        "name": "outbound",
        "schema": "",
        "rowCount": 2
      },
      {
        "name": "inbound",
        "schema": "++id",
        "rowCount": 3
      }
    ],
    "data": [{
      "tableName": "outbound",
      "inbound": false,
      "rows": [
        [
          1,
          {
            "foo": "bar"
          }
        ],
        [
          2,
          {
            "bar": "foo"
          }
        ]
      ]
    },{
      "tableName": "inbound",
      "inbound": true,
      "rows": [
        {
          "id": 1,
          "date": 1,
          "fullBlob": {
            "type": "",
            "data": "AAECAwQFBgcICQoLDA0ODxAREhMUFRYXGBkaGxwdHh8gISIjJCUmJygpKissLS4vMDEyMzQ1Njc4OTo7PD0+P0BBQkNERUZHSElKS0xNTk9QUVJTVFVWV1hZWltcXV5fYGFiY2RlZmdoaWprbG1ub3BxcnN0dXZ3eHl6e3x9fn+AgYKDhIWGh4iJiouMjY6PkJGSk5SVlpeYmZqbnJ2en6ChoqOkpaanqKmqq6ytrq+wsbKztLW2t7i5uru8vb6/wMHCw8TFxsfIycrLzM3Oz9DR0tPU1dbX2Nna29zd3t/g4eLj5OXm5+jp6uvs7e7v8PHy8/T19vf4+fr7/P3+/w=="
          },
          "binary": {
            "buffer": "AQID",
            "byteOffset": 0,
            "length": 3
          },
          "text": "foo",
          "bool": false,
          "$types": {
            "date": "date",
            "fullBlob": "blob2",
            "binary": "uint8array2",
            "binary.buffer": "arraybuffer"
          }
        },
        {
          "id": 2,
          "foo": "bar"
        },
        {
          "id": 3,
          "bar": "foo"
        }
      ]
    }]
  }
}
```
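As an illustration, here is a sketch of checking the header of an already-parsed export file against the structure above; isDexieExport is a hypothetical helper, not part of the addon's API:

```javascript
// Sanity-check the top-level shape of a parsed Dexie export document.
function isDexieExport(doc) {
  return Boolean(
    doc &&
    doc.formatName === "dexie" &&
    doc.formatVersion === 1 &&
    typeof doc.data?.databaseName === "string" &&
    Array.isArray(doc.data?.tables)
  );
}

const header = {
  formatName: "dexie",
  formatVersion: 1,
  data: { databaseName: "mydb", databaseVersion: 1, tables: [], data: [] },
};
console.log(isDexieExport(header));              // true
console.log(isDexieExport({ formatName: "x" })); // false
```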
As Dexie can dynamically open non-Dexie IndexedDB databases, this is not an issue. A sample is provided here:
```js
import Dexie from 'dexie';
import 'dexie-export-import';

async function exportDatabase(databaseName) {
  const db = await new Dexie(databaseName).open();
  const blob = await db.export();
  return blob;
}

async function importDatabase(file) {
  const db = await Dexie.import(file);
  return db.backendDB();
}
```
This feature has been asked for a lot:
My simple answer initially was this:
```js
// Note: the functions are named exportDB/importDB here because "export" and
// "import" are reserved words and cannot be used as function names.
function exportDB(db) {
  return db.transaction('r', db.tables, () => {
    return Promise.all(
      db.tables.map(table =>
        table.toArray().then(rows => ({ table: table.name, rows }))
      )
    );
  });
}

function importDB(data, db) {
  return db.transaction('rw', db.tables, () => {
    return Promise.all(data.map(t =>
      db.table(t.table).clear()
        .then(() => db.table(t.table).bulkAdd(t.rows))
    ));
  });
}
```
Looks simple! But this naive approach has problems: the entire database must fit in RAM during both export and import, and exotic types (Dates, Blobs, typed arrays) do not survive plain JSON serialization. This addon solves these issues, and some more, with the help of some libraries.
To accomplish a streamable export/import, and to allow exotic types, I use the libraries listed below. Note that these libraries are listed as devDependencies because they are bundled using rollupjs, so there's no real dependency from the library user's perspective.
These modules enable something similar to JSON.stringify() / JSON.parse() for exotic or custom types.
This module allows reading JSON in a streaming fashion.
I must admit that I had to do some research before I understood how to accomplish streaming JSON from client-side JavaScript (both reading and writing). It is really not obvious that this would even be possible: looking at the Blob interface, it does not provide any way of either reading or writing in a streamable fashion.

What I found (after some googling) was that it is indeed possible to do this on the current DOM platform (including IE11!).

A File or Blob represents something that can lie in a file on disk without being in RAM. So how do we read the first 100 bytes of a Blob without reading the whole thing?
```js
const firstPart = blob.slice(0, 100);
```
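Crucially, slice() only creates a view onto the Blob; no bytes are read into memory yet. A quick demonstration of the sizes involved (this also runs in Node 18+, where Blob is a global):

```javascript
// A 43-byte Blob and a 9-byte view onto its start; neither read is performed yet.
const blob = new Blob(["The quick brown fox jumps over the lazy dog"]);
const firstPart = blob.slice(0, 9);

console.log(blob.size);      // 43
console.log(firstPart.size); // 9
```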
Ok, and in the next step we use a FileReader to actually read this sliced Blob into memory.
```ts
const first100Chars = await readBlob(firstPart);

function readBlob(blob: Blob): Promise<string> {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onabort = ev => reject(new Error("file read aborted"));
    reader.onerror = ev => reject((ev.target as any).error);
    reader.onload = ev => resolve((ev.target as any).result);
    reader.readAsText(blob);
  });
}
```
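Combining slice() with a reader lets us consume an arbitrarily large Blob in fixed-size chunks. A minimal sketch, using the promise-based blob.text() instead of FileReader for brevity (available in modern browsers and Node 18+); forEachChunk is a hypothetical helper, not part of the addon:

```javascript
// Read a Blob chunk by chunk, passing each chunk's text to a callback.
// Only one chunk is in memory at a time.
async function forEachChunk(blob, chunkSize, callback) {
  for (let pos = 0; pos < blob.size; pos += chunkSize) {
    const chunkText = await blob.slice(pos, pos + chunkSize).text();
    callback(chunkText);
  }
}

// Usage:
// await forEachChunk(file, 1024 * 1024, text => feedParser(text));
```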
Voila!
But! How do we keep transactions alive while performing this non-IndexedDB async call?
I use two different solutions for this:
1. In web workers, use `new FileReaderSync()` instead of `new FileReader()`.
2. On the main thread, use `Dexie.waitFor()` while reading this short-lived chunk, keeping the transaction alive.

Ok, fine, but how do we parse the chunk then? We cannot use JSON.parse(firstPart), because the chunk will most definitely be incomplete.
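To see why a plain JSON.parse cannot work here, a small demonstration of parsing a truncated chunk:

```javascript
// JSON.parse throws on an incomplete chunk, which is why a streaming
// tokenizer is needed instead.
const fullJson = '{"formatName":"dexie","formatVersion":1}';
const partial = fullJson.slice(0, 20); // pretend this is the first chunk

let failed = false;
try {
  JSON.parse(partial);
} catch (e) {
  failed = true; // SyntaxError: unexpected end of JSON input
}
console.log(failed); // true
```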
Clarinet to the rescue. This library reads JSON and fires callbacks as JSON tokens come in.
Writing JSON is solved more easily. As the BlobBuilder interface was deprecated from the DOM, I at first thought this task impossible. But after digging around, I found that this too SHOULD be possible, if browsers implement the Blob interface correctly.

Blobs can be constructed from an array of other Blobs. This is the key.
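To illustrate, a minimal sketch of building an output Blob incrementally from smaller pieces (runnable in modern browsers and in Node 18+, where Blob is a global):

```javascript
// Each new Blob merely references its parts; nothing is copied into one
// big buffer, so the output can grow far beyond available RAM.
let out = new Blob([]);
for (const piece of ['{"formatName":"dexie",', '"formatVersion":1}']) {
  out = new Blob([out, piece], { type: "application/json" });
}

out.text().then(text => console.log(text));
// → {"formatName":"dexie","formatVersion":1}
```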
And that's pretty much it.