Installation
npm install s3-sync-client
Developer Guide
TypeScript
Yes
Module System
CommonJS
Min. Node Version
>=16.0.0
Node Version
18.15.0
NPM Version
9.2.0
Score
57.3
Supply Chain
100
Quality
75.9
Maintenance
100
Vulnerability
99.6
License
MIT
Languages
TypeScript (100%)
Developer
jeanbmar
Download Statistics
Total Downloads
1,608,382
Last Day
1,607
Last Week
14,167
Last Month
72,342
Last Year
780,847
GitHub Statistics
83 Stars
70 Commits
22 Forks
2 Watching
1 Branches
4 Contributors
Bundle Size
Minified: 22.84 kB
Minified + Gzipped: 4.73 kB
Package Meta Information
Latest Version
4.3.1
Package Id
s3-sync-client@4.3.1
Unpacked Size
119.43 kB
Size
22.26 kB
File Count
84
NPM Version
9.2.0
Node Version
18.15.0
Published On
03 Jul 2023
Total Downloads
1,608,382 cumulative downloads
Last day: 1,607 (-48.6% compared to previous day)
Last week: 14,167 (-19.7% compared to previous week)
Last month: 72,342 (-8.4% compared to previous month)
Last year: 780,847 (+28.3% compared to previous year)
Peer Dependencies
2
Dev Dependencies
16
AWS CLI s3 sync for Node.js
AWS CLI s3 sync for Node.js is a modern TypeScript client to perform S3 sync operations between file systems and S3 buckets, in the spirit of the official AWS CLI command.
AWS CLI installation is NOT required by this module.
Features
- Sync from an S3 bucket to a local file system
- Sync from a local file system to an S3 bucket (with multipart uploads support)
- Sync from an S3 bucket to another S3 bucket
- Sync only new and updated objects
- Support AWS CLI options --delete, --dryrun, --size-only, --include, --exclude, --follow-symlinks, --no-follow-symlinks
- Support AWS SDK native command input options
- Monitor sync progress
- Sync any number of objects (no 1000 objects limit)
- Transfer objects concurrently
- Manage differences in folder structures easily through relocation
Why should I use this module?
- There is no way to achieve S3 sync using the AWS SDK for JavaScript v3 alone
- AWS CLI installation is NOT required
- The module contains no external dependency
- The AWS SDK peer dependency is up-to-date (AWS SDK for JavaScript v3)
- The module overcomes a set of common limitations listed at the bottom of this README
Getting Started
Install
npm i s3-sync-client
Quick Start
Client initialization
S3SyncClient is a wrapper for the AWS SDK S3Client class.
```js
import { S3Client } from '@aws-sdk/client-s3';
import { S3SyncClient } from 's3-sync-client';

const s3Client = new S3Client({ /* ... */ });
const { sync } = new S3SyncClient({ client: s3Client });
```
Sync from file system to S3 bucket
```js
// aws s3 sync /path/to/local/dir s3://mybucket2
await sync('/path/to/local/dir', 's3://mybucket2');
await sync('/path/to/local/dir', 's3://mybucket2', { partSize: 100 * 1024 * 1024 }); // uses multipart uploads for files larger than 100 MB

// aws s3 sync /path/to/local/dir s3://mybucket2/zzz --delete
await sync('/path/to/local/dir', 's3://mybucket2/zzz', { del: true });
```
Sync from S3 bucket to file system
```js
// aws s3 sync s3://mybucket /path/to/some/local --delete
await sync('s3://mybucket', '/path/to/some/local', { del: true });

// aws s3 sync s3://mybucket2 /path/to/local/dir --dryrun
const diff = await sync('s3://mybucket2', '/path/to/local/dir', { dryRun: true });
console.log(diff); // log operations to perform
```
Sync from S3 bucket to S3 bucket
```js
// aws s3 sync s3://my-source-bucket s3://my-target-bucket --delete
await sync('s3://my-source-bucket', 's3://my-target-bucket', { del: true });
```
Monitor sync progress
```js
import { TransferMonitor } from 's3-sync-client';

const monitor = new TransferMonitor();
monitor.on('progress', (progress) => console.log(progress));
await sync('s3://mybucket', '/path/to/local/dir', { monitor });

/* output:
...
{
  size: { current: 11925, total: 35688 },
  count: { current: 3974, total: 10000 }
}
...
*/

// to pull status info occasionally only, use monitor.getStatus():
const timeout = setInterval(() => console.log(monitor.getStatus()), 2000);
try {
  await sync('s3://mybucket', '/path/to/local/dir', { monitor });
} finally {
  clearInterval(timeout);
}
```
Abort sync
```js
import { AbortController } from '@aws-sdk/abort-controller';
import { TransferMonitor } from 's3-sync-client';

const abortController = new AbortController();
setTimeout(() => abortController.abort(), 30000);

await sync('s3://mybucket', '/path/to/local/dir', { abortSignal: abortController.signal });
```
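For illustration, the abort flow can be exercised without any AWS calls, since it only relies on the controller and its signal. The sketch below uses Node's built-in AbortController (global since Node 15) as a stand-in; whether the SDK accepts a standard AbortSignal may depend on the SDK version, so the @aws-sdk/abort-controller import above remains the safe choice:

```javascript
// Demonstrate the abort flow with Node's built-in AbortController.
// The sync call itself is omitted; only the signal wiring is shown.
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), 30000);

console.log(controller.signal.aborted); // false until abort() fires
controller.abort(); // abort immediately for demonstration
clearTimeout(timer);
console.log(controller.signal.aborted); // true

// The signal would be passed as:
// await sync('s3://mybucket', '/path/to/local/dir', { abortSignal: controller.signal });
```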
Use AWS SDK command input options
```js
import mime from 'mime-types';
/*
commandInput properties can either be:
  - a plain object
  - a function returning a plain object
*/

// set ACL, fixed value
await sync('s3://mybucket', '/path/to/local/dir', {
  commandInput: {
    ACL: 'aws-exec-read',
  },
});

// set content type, dynamic value (function)
await sync('s3://mybucket1', 's3://mybucket2', {
  commandInput: (input) => ({
    ContentType: mime.lookup(input.Key) || 'text/html',
  }),
});
```
Filter source files
```js
// aws s3 sync s3://my-source-bucket s3://my-target-bucket --exclude "*" --include "*.txt" --include "flowers/*"
await sync('s3://my-source-bucket', 's3://my-target-bucket', {
  filters: [
    { exclude: () => true }, // exclude everything
    { include: (key) => key.endsWith('.txt') }, // include .txt files
    { include: (key) => key.startsWith('flowers/') }, // also include everything inside the flowers folder
  ],
});
```
Relocate objects during sync
```js
// move objects from source folder a/b/ to target folder zzz/
await sync('s3://my-source-bucket', 's3://my-target-bucket', {
  relocations: [ // multiple relocations can be applied
    (currentPath) =>
      currentPath.startsWith('a/b/')
        ? currentPath.replace('a/b/', 'zzz/')
        : currentPath,
  ],
});

// aws s3 sync s3://mybucket/flowers/red /path/to/local/dir
// as in the CLI, folder flowers/red will be flattened during sync
await sync('s3://mybucket/flowers/red', '/path/to/local/dir');
```
Note: relocations are applied after all other options, such as filters.
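Because filters and relocations are plain callbacks, this ordering can be checked in isolation: filters receive the original object key, and relocations then rewrite the path. A minimal sketch (the callbacks below are hypothetical examples, not part of the library):

```javascript
// Hypothetical filter and relocation callbacks matching the shapes
// described above.
const includeTxt = (key) => key.endsWith('.txt'); // filter: keep only .txt keys
const relocate = (path) => // relocation: move a/b/ under zzz/
  path.startsWith('a/b/') ? path.replace('a/b/', 'zzz/') : path;

// Filters see the original key...
console.log(includeTxt('a/b/notes.txt')); // true
// ...and the relocation is applied afterwards:
console.log(relocate('a/b/notes.txt')); // 'zzz/notes.txt'

// They would be passed to sync as:
// await sync('s3://my-source-bucket', 's3://my-target-bucket', {
//   filters: [{ include: includeTxt }],
//   relocations: [relocate],
// });
```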
Additional examples are available in the repo test directory.
API Reference
A complete API reference is available in the repo docs directory.
Class: S3SyncClient
new S3SyncClient(options)
- options <S3SyncClientConfig>
  - client <S3Client> Instance of AWS SDK S3Client.
client.sync(localDir, bucketPrefix[, options])
Sync from file system to S3 bucket.
Similar to AWS CLI aws s3 sync localDir bucketPrefix [options].
- localDir <string> Local directory
- bucketPrefix <string> Remote bucket name which may contain a prefix appended with a / separator
- options <SyncBucketWithLocalOptions>
  - dryRun <boolean> Equivalent to CLI --dryrun option
  - del <boolean> Equivalent to CLI --delete option
  - deleteExcluded <boolean> Delete excluded target objects even if they match a source object. Ignored if del is false. See this CLI issue.
  - sizeOnly <boolean> Equivalent to CLI --size-only option
  - followSymlinks <boolean> Equivalent to CLI --follow-symlinks option (default true)
  - relocations <Relocation[]> Allows uploading objects to remote folders without mirroring the source directory structure. Each relocation is a callback taking a string posix path and returning a relocated string posix path.
  - filters <Filter[]> Almost equivalent to CLI --exclude and --include options. Filters can be specified using plain objects including either an include or exclude property. The include and exclude properties are functions that take an object key and return a boolean.
  - abortSignal <AbortSignal> Allows aborting the sync
  - commandInput <CommandInput<PutObjectCommandInput>> | <CommandInput<CreateMultipartUploadCommandInput>> Set any of the SDK <PutObjectCommandInput> or <CreateMultipartUploadCommandInput> options on uploads
  - monitor <TransferMonitor>
    - Attach the progress event to receive upload progress notifications
    - Call getStatus() to retrieve progress info on demand
  - maxConcurrentTransfers <number> Each upload generates a Promise which is resolved when a local object is written to the S3 bucket. This parameter sets the maximum number of upload promises that might be running concurrently.
  - partSize <number> Set the part size in bytes for multipart uploads. Defaults to 5 MB.
- Returns: <Promise<SyncBucketWithLocalCommandOutput>> Fulfills with sync operations upon success.
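As a sketch, several of the options above combined into a single options object (the paths, bucket, and chosen values are hypothetical placeholders):

```javascript
// Hypothetical options for client.sync(localDir, bucketPrefix, options),
// combining several of the documented options.
const options = {
  dryRun: false,
  del: true,                        // like aws s3 sync --delete
  sizeOnly: false,
  followSymlinks: true,
  maxConcurrentTransfers: 100,
  partSize: 10 * 1024 * 1024,       // 10 MB multipart parts (default is 5 MB)
  filters: [{ exclude: (key) => key.endsWith('.tmp') }],
};

console.log(options.partSize); // 10485760
// await sync('/path/to/local/dir', 's3://mybucket', options);
```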
client.sync(bucketPrefix, localDir[, options])
Sync from S3 bucket to file system.
Similar to AWS CLI aws s3 sync bucketPrefix localDir [options].
- bucketPrefix <string> Remote bucket name which may contain a prefix appended with a / separator
- localDir <string> Local directory
- options <SyncLocalWithBucketOptions>
  - dryRun <boolean> Equivalent to CLI --dryrun option
  - del <boolean> Equivalent to CLI --delete option
  - deleteExcluded <boolean> Delete excluded target objects even if they match a source object. Ignored if del is false. See this CLI issue.
  - sizeOnly <boolean> Equivalent to CLI --size-only option
  - followSymlinks <boolean> Equivalent to CLI --follow-symlinks option (default true)
  - relocations <Relocation[]> Allows downloading objects to local directories without mirroring the source folder structure. Each relocation is a callback taking a string posix path and returning a relocated string posix path.
  - filters <Filter[]> Almost equivalent to CLI --exclude and --include options. Filters can be specified using plain objects including either an include or exclude property. The include and exclude properties are functions that take an object key and return a boolean.
  - abortSignal <AbortSignal> Allows aborting the sync
  - commandInput <CommandInput<GetObjectCommandInput>> Set any of the SDK <GetObjectCommandInput> options on downloads
  - monitor <TransferMonitor>
    - Attach the progress event to receive download progress notifications
    - Call getStatus() to retrieve progress info on demand
  - maxConcurrentTransfers <number> Each download generates a Promise which is resolved when a remote object is written to the local file system. This parameter sets the maximum number of download promises that might be running concurrently.
- Returns: <Promise<SyncLocalWithBucketCommandOutput>> Fulfills with sync operations upon success.
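The payload delivered by the monitor's progress event (the { size, count } shape shown in the Quick Start output) can be reduced to display percentages with a small helper; a sketch, not part of the library:

```javascript
// Compute display percentages from a TransferMonitor 'progress' payload,
// assuming the { size, count } shape shown in the Quick Start output.
function formatProgress({ size, count }) {
  const pct = (x) => (x.total === 0 ? 0 : Math.round((100 * x.current) / x.total));
  return `${pct(size)}% of bytes, ${pct(count)}% of objects`;
}

console.log(formatProgress({
  size: { current: 11925, total: 35688 },
  count: { current: 3974, total: 10000 },
})); // '33% of bytes, 40% of objects'

// Usage: monitor.on('progress', (p) => console.log(formatProgress(p)));
```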
client.sync(sourceBucketPrefix, targetBucketPrefix[, options])
Sync from S3 bucket to S3 bucket.
Similar to AWS CLI aws s3 sync sourceBucketPrefix targetBucketPrefix [options].
- sourceBucketPrefix <string> Remote reference bucket name which may contain a prefix appended with a / separator
- targetBucketPrefix <string> Remote bucket name to sync which may contain a prefix appended with a / separator
- options <SyncBucketWithBucketOptions>
  - dryRun <boolean> Equivalent to CLI --dryrun option
  - del <boolean> Equivalent to CLI --delete option
  - deleteExcluded <boolean> Delete excluded target objects even if they match a source object. Ignored if del is false. See this CLI issue.
  - sizeOnly <boolean> Equivalent to CLI --size-only option
  - relocations <Relocation[]> Allows copying objects to remote folders without mirroring the source folder structure. Each relocation is a callback taking a string posix path and returning a relocated string posix path.
  - filters <Filter[]> Almost equivalent to CLI --exclude and --include options. Filters can be specified using plain objects including either an include or exclude property. The include and exclude properties are functions that take an object key and return a boolean.
  - abortSignal <AbortSignal> Allows aborting the sync
  - commandInput <CommandInput<CopyObjectCommandInput>> Set any of the SDK <CopyObjectCommandInput> options on copy operations
  - monitor <TransferMonitor>
    - Attach the progress event to receive copy progress notifications
    - Call getStatus() to retrieve progress info on demand
  - maxConcurrentTransfers <number> Each copy generates a Promise which is resolved after the object has been copied. This parameter sets the maximum number of copy promises that might be running concurrently.
- Returns: <Promise<SyncBucketWithBucketCommandOutput>> Fulfills with sync operations upon success.
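Since commandInput accepts a function of the per-object input, a derived value such as ContentType can be computed from the key. The Quick Start uses the mime-types package for this; the extension map below is a hypothetical, dependency-free stand-in:

```javascript
// Hypothetical content-type lookup used to build a commandInput function.
const TYPES = {
  '.txt': 'text/plain',
  '.json': 'application/json',
  '.png': 'image/png',
};

const contentTypeFor = (key) => {
  const dot = key.lastIndexOf('.');
  return dot === -1
    ? 'application/octet-stream'
    : (TYPES[key.slice(dot)] ?? 'application/octet-stream');
};

console.log(contentTypeFor('flowers/red/rose.png')); // 'image/png'
console.log(contentTypeFor('README'));               // 'application/octet-stream'

// Passed to a bucket-to-bucket sync as:
// await sync('s3://my-source-bucket', 's3://my-target-bucket', {
//   commandInput: (input) => ({ ContentType: contentTypeFor(input.Key) }),
// });
```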
Change Log
See CHANGELOG.md.
Benchmark
AWS CLI s3 sync for Node.js has been developed to solve the S3 sync limitations of existing GitHub repos and NPM modules.
Most of the existing repos and NPM modules suffer from one or more of the following limitations:
- requires AWS CLI to be installed
- uses ETag to perform file comparison (ETag should be considered an opaque field and shouldn't be used)
- limits S3 bucket object listing to 1000 objects
- supports syncing local to bucket, but doesn't support syncing bucket to local
- doesn't support multipart uploads
- uses outdated dependencies
- is unmaintained
The following JavaScript modules suffer from at least one of these limitations:
No vulnerabilities found.
Reason
no dangerous workflow patterns detected
Reason
no binaries found in the repo
Reason
license file detected
Details
- Info: project has a license file: LICENSE:0
- Info: FSF or OSI recognized license: MIT License: LICENSE:0
Reason
7 existing vulnerabilities detected
Details
- Warn: Project is vulnerable to: GHSA-grv7-fg5c-xmjg
- Warn: Project is vulnerable to: GHSA-3xgq-45jj-v275
- Warn: Project is vulnerable to: GHSA-9c47-m6qq-7p4h
- Warn: Project is vulnerable to: GHSA-952p-6rrq-rcjv
- Warn: Project is vulnerable to: GHSA-c2qf-rxjj-qqgw
- Warn: Project is vulnerable to: GHSA-f5x3-32g6-xq36
- Warn: Project is vulnerable to: GHSA-j8xg-fqg3-53r7
Reason
dependency not pinned by hash detected -- score normalized to 2
Details
- Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/test.yml:20: update your workflow using https://app.stepsecurity.io/secureworkflow/jeanbmar/s3-sync-client/test.yml/master?enable=pin
- Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/test.yml:22: update your workflow using https://app.stepsecurity.io/secureworkflow/jeanbmar/s3-sync-client/test.yml/master?enable=pin
- Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/test.yml:26: update your workflow using https://app.stepsecurity.io/secureworkflow/jeanbmar/s3-sync-client/test.yml/master?enable=pin
- Info: 0 out of 3 GitHub-owned GitHubAction dependencies pinned
- Info: 1 out of 1 npmCommand dependencies pinned
Reason
Found 3/26 approved changesets -- score normalized to 1
Reason
detected GitHub workflow tokens with excessive permissions
Details
- Warn: no topLevel permission defined: .github/workflows/test.yml:1
- Info: no jobLevel write permissions found
Reason
0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
Reason
no effort to earn an OpenSSF best practices badge detected
Reason
security policy file not detected
Details
- Warn: no security policy file detected
- Warn: no security file to analyze
- Warn: no security file to analyze
- Warn: no security file to analyze
Reason
project is not fuzzed
Details
- Warn: no fuzzer integrations found
Reason
branch protection not enabled on development/release branches
Details
- Warn: branch protection not enabled for branch 'master'
Reason
SAST tool is not run on all commits -- score normalized to 0
Details
- Warn: 0 commits out of 21 are checked with a SAST tool
Score: 3/10
Last Scanned on 2024-12-16
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.
Other packages similar to s3-sync-client
s3-client
high level amazon s3 client. upload and download files and directories
@di-fe/s3
high level amazon s3 client. upload and download files and directories
@elucidatainc/s3-node-client
high level amazon s3 client. upload and download files and directories
node-s3-client
high level amazon s3 client. upload and download files and directories