@aws-sdk/client-sso
Modularized AWS SDK for JavaScript.
npm install @aws-sdk/client-sso
The AWS SDK for JavaScript v3 is a rewrite of v2 with some great new features. As with version 2, it enables you to easily work with Amazon Web Services, but it has a modular architecture with a separate package for each service. It also includes many frequently requested features, such as first-class TypeScript support and a new middleware stack. For more details, visit the blog post on general availability of the modular AWS SDK for JavaScript.
To get started with JavaScript SDK version 3, visit our Developer Guide or API Reference.
If you are starting a new project with the AWS SDK for JavaScript v3, you can refer to aws-sdk-js-notes-app, which shows examples of calling multiple AWS services in a note-taking application. If you are migrating from v2 to v3, you can visit our self-guided workshop, which builds a basic version of the note-taking application using the AWS SDK for JavaScript v2 and provides step-by-step migration instructions to v3.
To test your universal JavaScript code in Node.js, browser, and React Native environments, visit our code samples repo.
Performance is crucial for the AWS SDK for JavaScript because it directly impacts the user experience. Please refer to the Performance section to learn more.
Let’s walk through setting up a project that depends on DynamoDB from the SDK and makes a simple service call. The following steps use yarn as an example. These steps assume you have Node.js and yarn already installed.
Create a new Node.js project.
Inside the project, run: yarn add @aws-sdk/client-dynamodb. Adding packages updates your lock file, yarn.lock or package-lock.json. You should commit your lock file along with your code to avoid potential breaking changes.
Create a new file called index.js, create a DynamoDB service client and send a request.
```js
const { DynamoDBClient, ListTablesCommand } = require("@aws-sdk/client-dynamodb");

(async () => {
  const client = new DynamoDBClient({ region: "us-west-2" });
  const command = new ListTablesCommand({});
  try {
    const results = await client.send(command);
    console.log(results.TableNames.join("\n"));
  } catch (err) {
    console.error(err);
  }
})();
```
If you want to use non-modular (v2-like) interfaces, you can import the client with only the service name (e.g. DynamoDB) and call the operation name directly from the client:
```js
const { DynamoDB } = require("@aws-sdk/client-dynamodb");

(async () => {
  const client = new DynamoDB({ region: "us-west-2" });
  try {
    const results = await client.listTables({});
    console.log(results.TableNames.join("\n"));
  } catch (err) {
    console.error(err);
  }
})();
```
If you use tree shaking to reduce bundle size, the non-modular interface will increase the bundle size compared to the modular interface.
If you are consuming the modular AWS SDK for JavaScript in React Native environments, you will need to add and import the following polyfills in your React Native application:
```js
import "react-native-get-random-values";
import "react-native-url-polyfill/auto";
import "web-streams-polyfill/dist/polyfill";

import { DynamoDB } from "@aws-sdk/client-dynamodb";
```
Specifically, for the Metro bundler used by React Native, enable Package Exports Support.
The SDK is now split up across multiple packages. The 2.x version of the SDK contained support for every service, which made it very easy to use multiple services in a project. Due to the limitations around reducing the size of the SDK when only using a handful of services or operations, many customers requested separate packages for each service client. We have also split up the core parts of the SDK so that service clients only pull in what they need. For example, a service that sends responses in JSON will no longer need an XML parser as a dependency.
For those who were already importing services as sub-modules from the v2 SDK, the import statement doesn’t look too different. Here’s an example of importing the AWS Lambda service in v2 and v3 of the SDK:
```js
// import the Lambda client constructor in v2 of the SDK
const Lambda = require("aws-sdk/clients/lambda");

// import the Lambda client constructor in the v3 SDK
const { Lambda } = require("@aws-sdk/client-lambda");
```
It is also possible to import both versions of the Lambda client by changing the variable name the Lambda constructor is stored in.
We’ve made several public API changes to improve consistency, make the SDK easier to use, and remove deprecated or confusing APIs. The following are some of the big changes included in the new AWS SDK for JavaScript v3.
In version 2.x of the SDK, service configuration could be passed to individual client constructors.
However, these configurations would first be merged automatically into a copy of the global SDK configuration, AWS.config. Also, calling AWS.config.update({/* params */}) only updated configuration for service clients instantiated after the update call was made, not for any existing clients.
This behavior was a frequent source of confusion, and made it difficult to add configuration to the global object that only affects a subset of service clients in a forward-compatible way. In v3, there is no longer a global configuration managed by the SDK. Configuration must be passed to each service client that is instantiated. It is still possible to share the same configuration across multiple clients but that configuration will not be automatically merged with a global state.
Version 2.x of the SDK allowed modifying a request throughout multiple stages of its lifecycle by attaching event listeners to the request. Some feedback we received frequently was that it can be difficult to debug what went wrong during a request’s lifecycle. We’ve now switched to using a middleware stack to control the lifecycle of an operation call. This gives us a few benefits. Each middleware in the stack calls the next middleware after making any changes to the request object. This also makes debugging issues in the stack much easier, since you can see exactly which middleware have been called leading up to an error. Here’s an example of logging requests using middleware:
```js
const client = new DynamoDB({ region: "us-west-2" });

client.middlewareStack.add(
  (next, context) => async (args) => {
    console.log("AWS SDK context", context.clientName, context.commandName);
    console.log("AWS SDK request input", args.input);
    const result = await next(args);
    console.log("AWS SDK request output:", result.output);
    return result;
  },
  {
    name: "MyMiddleware",
    step: "build",
    override: true,
  }
);

await client.listTables({});
```
In the above example, we’re adding a middleware to our DynamoDB client’s middleware stack. The first argument is a function that accepts next, the next middleware in the stack to call, and context, an object that contains some information about the operation being called. It returns a function that accepts args, an object that contains the parameters passed to the operation and the request, and returns the result from calling the next middleware with args.
If you are looking for a breakdown of the API changes from AWS SDK for JavaScript v2 to v3, we have them listed in UPGRADING.md.
The Lambda-provided AWS SDK is pinned to a specific minor version, NOT the latest version. To check the minor version used by Lambda, please refer to the Lambda runtimes doc page. If you wish to use the latest or a different version of the SDK from the one provided by Lambda, we recommend that you bundle and minify your project, or upload it as a Lambda layer.
The performance of the AWS SDK for JavaScript v3 on Node.js 18 has improved from v2, as seen in the performance benchmarking.
When using Lambda, you should use a single SDK client per service, per region, and initialize it outside of the handler's code path. This is done to optimize for Lambda's container reuse.
The API calls themselves should be made from within the handler's code path. This ensures that API calls are signed at the very last step of Lambda's execution cycle, after the Lambda is "hot", to avoid signing time skew.
Example:
```js
import { STSClient, GetCallerIdentityCommand } from "@aws-sdk/client-sts";

const client = new STSClient({}); // SDK client initialized outside the handler

export const handler = async (event) => {
  const response = {
    statusCode: 200,
    headers: {
      "Content-Type": "application/json",
    },
  };

  try {
    const results = await client.send(new GetCallerIdentityCommand({})); // API operation made from within the handler
    const responseBody = {
      userId: results.UserId,
    };

    response.body = JSON.stringify(responseBody);
  } catch (err) {
    console.log("Error:", err);
    response.statusCode = 500;
    response.body = JSON.stringify({
      message: "Internal Server Error",
    });
  }

  return response;
};
```
Please refer to the supplemental docs on performance to learn more.
All clients have been published to NPM and can be installed as described above. If you want to play with the latest clients, you can build from source as follows:
Clone this repository locally:
git clone https://github.com/aws/aws-sdk-js-v3.git
Under the repository root directory, run the following command to link and build the whole library; the process may take several minutes:
yarn && yarn test:all
For more information, please refer to the contributing guide.
After the repository is successfully built, change directory to the client that you want to install, for example:
cd clients/client-dynamodb
Pack the client:

yarn pack .

yarn pack will create an archive file in the client package folder, e.g. aws-sdk-client-dynamodb-v3.0.0.tgz.
Change directory to the project you are working on and move the archive to the location where you store vendor packages:
mv path/to/aws-sdk-js-v3/clients/client-dynamodb/aws-sdk-client-dynamodb-v3.0.0.tgz ./path/to/vendors/folder
Install the package to your project:
yarn add ./path/to/vendors/folder/aws-sdk-client-dynamodb-v3.0.0.tgz
You can provide feedback to us in several ways; both positive and negative feedback is appreciated. Please feel free to open an issue on our GitHub repository. Our GitHub issues page also includes work we know still needs to be done to reach full feature parity with the v2 SDK.
GitHub issues. Customers who are comfortable giving public feedback can open a GitHub issue in the new repository. This is the preferred mechanism to give feedback so that other customers can engage in the conversation, +1 issues, etc. Issues you open will be evaluated, and included in our roadmap for the GA launch.
Gitter channel. For informal discussion or general feedback, you may join the Gitter chat. The Gitter channel is also a great place to get help with v3 from other developers. The JS SDK team doesn't track the discussion daily, so feel free to open a GitHub issue if your question is not answered there.
You can open pull requests for fixes or additions to the new AWS SDK for JavaScript v3. All pull requests must be submitted under the Apache 2.0 license and will be reviewed by an SDK team member prior to merging. Accompanying unit tests are appreciated. See Contributing for more information.
This is an introduction to some of the high level concepts behind AWS SDK for JavaScript (v3) which are shared between services and might make your life easier. Please consult the user guide and API reference for service specific details.
Bare-bones clients/commands: This refers to a modular way of consuming individual operations on JS SDK clients. It results in less code being imported, and is thus more performant. It is otherwise equivalent to the aggregated clients/commands.
```js
// this imports a bare-bones version of S3 that exposes the .send operation
import { S3Client } from "@aws-sdk/client-s3";

// this imports just the getObject operation from S3
import { GetObjectCommand } from "@aws-sdk/client-s3";

// usage
const bareBonesS3 = new S3Client({...});
await bareBonesS3.send(new GetObjectCommand({...}));
```
Aggregated clients/commands: This refers to a way of consuming clients that contain all operations on them. Under the hood this calls the bare-bones commands. It imports all commands on a particular client, resulting in more code being imported, and is thus less performant. This is 1:1 with v2's style.
```js
// this imports an aggregated version of S3 that exposes the .send operation
import { S3 } from "@aws-sdk/client-s3";

// No need to import an operation as all operations are already on the S3 prototype

// usage
const aggregatedS3 = new S3({...});
await aggregatedS3.getObject({...});
```
The v3 codebase is generated from internal AWS models that AWS services expose. We use smithy-typescript to generate all code in the /clients subdirectory. These packages always have a prefix of @aws-sdk/client-XXXX and are one-to-one with AWS services and service operations. You should be importing @aws-sdk/client-XXXX for most usage.
Clients depend on common "utility" code in /packages. The code in /packages is manually written and, outside of special cases (like credentials or the abort controller), is generally not very useful alone.
Lastly, we have higher-level libraries in /lib. These are JavaScript-specific libraries that wrap client operations to make them easier to work with. Popular examples are @aws-sdk/lib-dynamodb, which simplifies working with items in Amazon DynamoDB, and @aws-sdk/lib-storage, which exposes the Upload function and simplifies parallel uploads in S3's multipartUpload.
- /packages: This subdirectory is where most manual code updates are done. These are published to NPM under @aws-sdk/XXXX and have no special prefix.
- /clients: This subdirectory is code generated and depends on code published from /packages. It is 1:1 with AWS services and operations; manual edits should generally not occur here. These are published to NPM under @aws-sdk/client-XXXX.
- /lib: This subdirectory depends on generated code published from /clients. It wraps existing AWS services and operations to make them easier to work with in JavaScript. These are published to NPM under @aws-sdk/lib-XXXX.
Certain command outputs include streams, which have different implementations in Node.js and browsers. For convenience, a set of stream handling methods will be merged (via Object.assign) onto the output stream object, as defined in SdkStreamMixin.

Output types having this feature are indicated by the WithSdkStreamMixin<T, StreamKey> wrapper type, where T is the original output type and StreamKey is the output property key having a stream type specific to the runtime environment.

Here is an example using S3::GetObject.
```js
import { S3 } from "@aws-sdk/client-s3";

const client = new S3({});

const getObjectResult = await client.getObject({
  Bucket: "...",
  Key: "...",
});

// env-specific stream with added mixin methods.
const bodyStream = getObjectResult.Body;

// one-time transform.
const bodyAsString = await bodyStream.transformToString();

// throws an error on 2nd call, stream cannot be rewound.
const __error__ = await bodyStream.transformToString();
```
Note that these methods will read the stream in order to collect it, so you must save the output. The methods cannot be called more than once on a stream.
Many AWS operations return paginated results when the response object is too large to return in a single response. In AWS SDK for JavaScript v2, the response contains a token you can use to retrieve the next page of results. You then need to write additional functions to process pages of results.
In AWS SDK for JavaScript v3, we’ve improved pagination using async generator functions, which are similar to generator functions, with the following differences:
- When called, async generator functions return an object, an async generator, whose methods (next, throw, and return) return promises for { value, done } instead of directly returning { value, done }. This automatically makes the returned async generator objects async iterators.
- for await (x of y) statements are allowed.
- yield* is modified to support delegation to async iterables.

Async iterators were added in the ES2018 iteration of JavaScript. They are supported by Node.js 10.x+ and by all modern browsers, including Chrome 63+, Firefox 57+, Safari 11.1+, and Edge 79+. If you're using TypeScript v2.3+, you can compile async iterators to older versions of JavaScript.
An async iterator is much like an iterator, except that its next() method returns a promise for a { value, done } pair. As an implicit aspect of the Async Iteration protocol, the next promise is not requested until the previous one resolves. This is a simple, yet very powerful pattern.
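The protocol can be seen with a hand-written async generator (plain JavaScript, no SDK required; the counter function is purely illustrative):

```javascript
// A minimal async generator: each next() call returns a promise
// for a { value, done } pair.
async function* counter(limit) {
  for (let i = 1; i <= limit; i++) {
    yield i;
  }
}

(async () => {
  // Driving the iterator by hand shows the { value, done } protocol.
  const it = counter(2);
  console.log(await it.next()); // { value: 1, done: false }
  console.log(await it.next()); // { value: 2, done: false }
  console.log(await it.next()); // { value: undefined, done: true }

  // The same object can also be consumed with for await...of.
  const values = [];
  for await (const v of counter(3)) {
    values.push(v);
  }
  console.log(values); // [ 1, 2, 3 ]
})();
```

This is exactly the shape the SDK's paginators expose, which is why they plug directly into for await...of loops.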
In v3, the clients expose paginateOperationName APIs that are written using async generators, allowing you to use async iterators in a for await..of loop. You can perform the paginateListTables operation from @aws-sdk/client-dynamodb as follows:
```js
const {
  DynamoDBClient,
  paginateListTables,
} = require("@aws-sdk/client-dynamodb");

// ...
const paginatorConfig = {
  client: new DynamoDBClient({}),
  pageSize: 25,
};
const commandParams = {};
const paginator = paginateListTables(paginatorConfig, commandParams);

const tableNames = [];
for await (const page of paginator) {
  // page contains a single paginated output.
  tableNames.push(...page.TableNames);
}
// ...
```
Or simplified:
```js
// ...
const client = new DynamoDBClient({});

const tableNames = [];
for await (const page of paginateListTables({ client }, {})) {
  // page contains a single paginated output.
  tableNames.push(...page.TableNames);
}
// ...
```
In v3, we support the AbortController interface which allows you to abort requests as and when desired.
The AbortController interface provides an abort() method that toggles the state of a corresponding AbortSignal object. Most APIs accept an AbortSignal object, and respond to abort() by rejecting any unsettled promise with an "AbortError".
```js
// Returns a new controller whose signal is set to a newly created AbortSignal object.
const controller = new AbortController();

// Returns the AbortSignal object associated with controller.
const signal = controller.signal;

// Invoking this method will set controller's AbortSignal's aborted flag
// and signal to any observers that the associated activity is to be aborted.
controller.abort();
```
In JavaScript SDK v3, we added an implementation of the WHATWG AbortController interface in @aws-sdk/abort-controller. To use it, pass AbortController.signal as abortSignal in the httpOptions parameter when calling the .send() operation on the client, as follows:
```js
const { AbortController } = require("@aws-sdk/abort-controller");
const { S3Client, CreateBucketCommand } = require("@aws-sdk/client-s3");

// ...

const abortController = new AbortController();
const client = new S3Client(clientParams);

const requestPromise = client.send(new CreateBucketCommand(commandParams), {
  abortSignal: abortController.signal,
});

// The abortController can be aborted any time.
// The request will not be created if abortSignal is already aborted.
// The request will be destroyed if abortSignal is aborted before response is returned.
abortController.abort();

// This will fail with "AbortError" as abortSignal is aborted.
await requestPromise;
```
The following code snippet shows how to upload a file using S3's PutObject API in the browser, with support for aborting the upload. First, create a controller using the AbortController() constructor, then grab a reference to its associated AbortSignal object using the AbortController.signal property. When the PutObjectCommand is called with the .send() operation, pass AbortController.signal as abortSignal in the httpOptions parameter. This will allow you to abort the PutObject operation by calling abortController.abort().
```js
const abortController = new AbortController();
const abortSignal = abortController.signal;

// Declare the handler before wiring up the listeners, so the
// reference below is initialized when the listener is registered.
const uploadObject = async (file) => {
  // ...
  const client = new S3Client(clientParams);
  try {
    await client.send(new PutObjectCommand(commandParams), { abortSignal });
  } catch (e) {
    if (e.name === "AbortError") {
      uploadProgress.textContent = "Upload aborted: " + e.message;
    }
    // ...
  }
};

const uploadBtn = document.querySelector(".upload");
const abortBtn = document.querySelector(".abort");

uploadBtn.addEventListener("click", uploadObject);

abortBtn.addEventListener("click", function () {
  abortController.abort();
  console.log("Upload aborted");
});
```
For a full abort controller deep dive, please check out our blog post.
The AWS SDK for JavaScript (v3) maintains a series of asynchronous actions. These include actions that serialize input parameters into data over the wire and deserialize response data into JavaScript objects. Such actions are implemented using functions called middleware, executed in a specific order. The object that hosts all the middleware, including the ordering information, is called a middleware stack. You can add your custom actions to the SDK and/or remove the default ones.
When an API call is made, the SDK sorts the middleware according to the step it belongs to and its priority within each step. The input parameters pass through each middleware. An HTTP request gets created and updated along the process. The HTTP handler sends the request to the service and receives a response. The response object is passed back through the same middleware stack in reverse, and is deserialized into a JavaScript object.
A middleware is a higher-order function that transfers user input and/or the HTTP request, then delegates to the "next" middleware. It also transfers back the result from the "next" middleware. A middleware function also has access to a context parameter, which optionally contains data to be shared across middleware.
For example, you can use middleware to log or modify a request:
```js
const { S3 } = require("@aws-sdk/client-s3");
const client = new S3({ region: "us-west-2" });

// Middleware added to client, applies to all commands.
client.middlewareStack.add(
  (next, context) => async (args) => {
    args.request.headers["x-amz-meta-foo"] = "bar";
    console.log("AWS SDK context", context.clientName, context.commandName);
    console.log("AWS SDK request input", args.input);
    const result = await next(args);
    console.log("AWS SDK request output:", result.output);
    return result;
  },
  {
    step: "build",
    name: "addFooMetadataMiddleware",
    tags: ["METADATA", "FOO"],
    override: true,
  }
);

await client.putObject(params);
```
Specifying the absolute location of your middleware

The example above adds middleware to the build step of the middleware stack. The middleware stack contains five steps to manage a request's lifecycle:

- initialize: initializes the API call, typically adding default input values to a command. The HTTP request has not yet been constructed.
- serialize: constructs an HTTP request for the API call, for example validating input and building the request from user input. Downstream middleware will have access to the serialized request in args.request.
- build: may modify the HTTP request constructed by the serialize step, for example injecting HTTP headers that describe a stable aspect of the request, such as Content-Length or a body checksum. Any request alterations will be applied to all retries.
- finalizeRequest: prepares the request to be sent over the wire, for example request signing and performing retries.
- deserialize: deserializes the raw response object into a structured response. Upstream middleware have access to the deserialized data in the next callback's return value: result.output.
Each middleware must be added to a specific step. By default, middleware in the same step have undifferentiated order. In some cases, you might want to execute a middleware before or after another middleware in the same step. You can achieve this by specifying its priority.

```js
client.middlewareStack.add(middleware, {
  name: "MyMiddleware",
  step: "initialize",
  priority: "high", // or "low".
  override: true, // provide both a name and override=true to avoid accidental middleware duplication.
});
```
For a full middleware stack deep dive, please check out our blog post.
Our releases usually happen once per weekday. Each release increments the minor version, e.g. 3.200.0 -> 3.201.0.
v3.201.0 and higher requires Node.js >= 14.
v3.46.0 to v3.200.0 requires Node.js >= 12.
Earlier versions require Node.js >= 10.
| Package name | Containing folder | API controlled by | Stability |
| --- | --- | --- | --- |
| @aws-sdk/client-* Commands | clients | AWS service teams | public/stable |
| @aws-sdk/client-* Clients | clients | AWS SDK JS team | public/stable |
| @aws-sdk/lib-* | lib | AWS SDK JS team | public/stable |
| @aws-sdk/*-signer | packages | AWS SDK JS team | public/stable |
| @aws-sdk/middleware-stack | packages | AWS SDK JS team | public/stable |
| remaining @aws-sdk/* | packages | AWS SDK JS team | internal |
Public interfaces are marked with the @public annotation in source code and appear in our API Reference.
Additional notes:

@internal does not mean a package or interface is constantly changing or being actively worked on. It means it is subject to change without any notice period. The changes are included in the release notes.

All supported interfaces are provided at the package level, e.g.:
```js
import { S3Client } from "@aws-sdk/client-s3"; // Yes, do this.

import { S3Client } from "@aws-sdk/client-s3/dist-cjs/S3Client"; // No, don't do this.
```
Do not import from a deep path in any package, since the file structure may change, and in the future packages may include the exports metadata in package.json, preventing access to the file structure.
This SDK has optional functionality that requires the AWS Common Runtime (CRT) bindings to be included as a dependency with your application, such as the signing functionality provided by @aws-sdk/signature-v4-crt.
If the required AWS Common Runtime components are not installed, you will receive an error like:
```
Cannot find module '@aws-sdk/signature-v4-crt'
...
Please check whether you have installed the "@aws-sdk/signature-v4-crt" package explicitly.
You must also register the package by calling [require("@aws-sdk/signature-v4-crt");]
or an ESM equivalent such as [import "@aws-sdk/signature-v4-crt";].
For more information please go to
https://github.com/aws/aws-sdk-js-v3#functionality-requiring-aws-common-runtime-crt
```
indicating that the required dependency is missing to use the associated functionality. To install this dependency, follow the provided instructions.
You can install the CRT dependency with different commands depending on the package management tool you are using. If you are using NPM:
```
npm install @aws-sdk/signature-v4-crt
```
If you are using Yarn:
```
yarn add @aws-sdk/signature-v4-crt
```
Additionally, load the signature-v4-crt package by importing it.
```js
require("@aws-sdk/signature-v4-crt");
// or ESM
import "@aws-sdk/signature-v4-crt";
```
Only the import statement is needed. The implementation then registers itself with @aws-sdk/signature-v4-multi-region and becomes available for use. You do not need to use any imported objects directly.
Related packages:

- @aws-sdk/client-sso-oidc: AWS SDK for JavaScript SSO OIDC Client for Node.js, Browser and React Native
- @aws-sdk/client-sso-admin: AWS SDK for JavaScript SSO Admin Client for Node.js, Browser and React Native
- @aws-sdk/credential-provider-sso: AWS credential provider that exchanges a resolved SSO login token file for temporary AWS credentials
- @aws-sdk/token-providers: A collection of token providers