Gathering detailed insights and metrics for @stoplight/json-schema-merge-allof
Related packages:
- json-schema-merge-allof: Simplify your schema by combining allOf into the root schema, safely.
- @stoplight/json-schema-sampler: Tool for generating samples based on JSON Schema Draft 7
- @types/json-schema-merge-allof: TypeScript definitions for json-schema-merge-allof
- allof-merge: Simplify JSON Schema/OpenAPI by combining allOf safely
npm install @stoplight/json-schema-merge-allof
154 commits, 2 forks, 2 watching, 12 branches, 55 contributors. Updated on 02 Jan 2024. Language: JavaScript (100%).
Downloads

| Period | Downloads | Change vs. previous period |
|---|---|---|
| Last day | 21,059 | -14.4% |
| Last week | 124,651 | 5.2% |
| Last month | 526,718 | 10.2% |
| Last year | 4,861,673 | 72.1% |
Merge schemas combined using allOf into a more readable composed schema free from allOf.
npm install json-schema-merge-allof --save
Since allOf requires ALL of the schemas provided (including the parent schema) to apply, we can iterate over all the schemas, extract all the values for, say, type, and find the intersection of valid values. Here is an example:
```js
{
  type: ['object', 'null'],
  additionalProperties: {
    type: 'string',
    minLength: 5
  },
  allOf: [{
    type: ['array', 'object'],
    additionalProperties: {
      type: 'string',
      minLength: 10,
      maxLength: 20
    }
  }]
}
```
This results in the following schema:
```js
{
  type: 'object',
  additionalProperties: {
    type: 'string',
    minLength: 10,
    maxLength: 20
  }
}
```
Notice that type now excludes null and array since those are not logically possible. Also minLength is raised to 10. The other properties have no conflict and are merged into the root schema with no resolving needed.
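For reference, the merge above can be reproduced with a call like the following. This is a minimal sketch that assumes the package's default export is the merge function used in the resolver examples further down (mergeAllOf), and that the @stoplight fork exposes the same API:

```js
const mergeAllOf = require('json-schema-merge-allof');
// or, for the fork this page describes:
// const mergeAllOf = require('@stoplight/json-schema-merge-allof');

const merged = mergeAllOf({
  type: ['object', 'null'],
  additionalProperties: {
    type: 'string',
    minLength: 5
  },
  allOf: [{
    type: ['array', 'object'],
    additionalProperties: {
      type: 'string',
      minLength: 10,
      maxLength: 20
    }
  }]
});

console.log(merged);
// { type: 'object', additionalProperties: { type: 'string', minLength: 10, maxLength: 20 } }
```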
For other keywords, other methods are used. The general strategy is to choose the most restrictive of the set of conflicting values. For some keywords that is done by intersection; for others, like required, it is done by a union of all the values, since that is the most restrictive.
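As a rough sketch of that strategy (under the same assumptions as the usage example above), required values are unioned while numeric constraints keep the most restrictive value:

```js
const mergeAllOf = require('json-schema-merge-allof');

const merged = mergeAllOf({
  type: 'object',
  required: ['name'],
  properties: {
    name: { type: 'string', minLength: 3 }
  },
  allOf: [{
    required: ['age'],
    properties: {
      name: { minLength: 5 },   // more restrictive, should win
      age: { type: 'integer' }
    }
  }]
});

// Expected by the strategy described above:
// merged.required        -> ['name', 'age']  (union)
// merged.properties.name -> { type: 'string', minLength: 5 }  (most restrictive)
```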
What you are left with is a schema completely free of allOf, except for a couple of keywords that are impossible to properly intersect/combine:
Conflicting pattern values are kept by combining them with allOf (two regular expressions cannot be merged into one). When multiple conflicting not values are found, the same approach is used, but with anyOf instead of allOf. When extraction of common rules from anyOf is in place, this can be simplified further.
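As an illustration (the exact output shape is an assumption based on the description above, not verified output):

```js
const mergeAllOf = require('json-schema-merge-allof');

// Two regular expressions cannot be intersected into a single pattern,
// so both are kept; per the description above they end up under allOf.
const patterns = mergeAllOf({
  type: 'string',
  pattern: '^[a-z]+',
  allOf: [{ pattern: '[0-9]$' }]
});
// Roughly: { type: 'string', allOf: [{ pattern: '^[a-z]+' }, { pattern: '[0-9]$' }] }

// Conflicting not values are kept the same way, but combined with anyOf.
const nots = mergeAllOf({
  not: { type: 'string' },
  allOf: [{ not: { type: 'number' } }]
});
// Roughly: { not: { anyOf: [{ type: 'string' }, { type: 'number' }] } }
```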
ignoreAdditionalProperties default false
Allows you to combine schema properties even though some schemas have additionalProperties: false
This is the most common issue people face when trying to expand schemas using allOf, and it is a limitation of the JSON Schema spec. Be aware, though, that the schema produced will allow more than the original schema. But this is useful if you just want to combine schemas using allOf as if additionalProperties wasn't false during the merge process. The resulting schema will still get additionalProperties set to false.
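A small sketch of the option in use (same assumptions about the export as in the earlier examples):

```js
const mergeAllOf = require('json-schema-merge-allof');

const merged = mergeAllOf({
  allOf: [{
    properties: { name: { type: 'string' } },
    additionalProperties: false
  }, {
    properties: { age: { type: 'integer' } },
    additionalProperties: false
  }]
}, { ignoreAdditionalProperties: true });

// Both properties survive the merge, and the result still ends up with
// additionalProperties: false (which, as noted above, is looser than what
// the original allOf combination would have allowed).
```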
resolvers Object
Override any default resolver like this:
```js
mergeAllOf(schema, {
  resolvers: {
    title: function(values, path, mergeSchemas, options) {
      // choose what title you want to be used based on the conflicting values
      // resolvers MUST return a value other than undefined
    }
  }
})
```
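As a concrete, purely illustrative example, a resolver that keeps the longest of the conflicting titles could look like this:

```js
const mergeAllOf = require('json-schema-merge-allof');

const merged = mergeAllOf({
  title: 'Person',
  allOf: [{ title: 'A person record' }]
}, {
  resolvers: {
    // keep the longest of the conflicting titles;
    // a resolver must return a value other than undefined
    title: function(values, path, mergeSchemas, options) {
      return values.reduce((a, b) => (b.length > a.length ? b : a));
    }
  }
});
// merged.title -> 'A person record'
```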
deep boolean, default true
If false, resolves only the top-level allOf keyword in the schema. If true, resolves all allOf keywords in the schema.
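For example (a sketch under the same assumptions as the earlier examples), nested allOf keywords are left untouched when deep is false:

```js
const mergeAllOf = require('json-schema-merge-allof');

const merged = mergeAllOf({
  allOf: [{ type: 'object' }],               // merged into the root
  properties: {
    nested: { allOf: [{ type: 'string' }] }  // left as-is when deep is false
  }
}, { deep: false });
```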
A resolver function is passed four arguments, as shown in the signature above: values, an array of the conflicting values that need to be resolved; path, the path to the position in the schema that caused the resolver to be called; mergeSchemas, a function you can call to merge an array of schemas; and options, the options mergeAllOf was called with.
No separate resolver is called for patternProperties and additionalProperties; only the properties resolver is called. The same goes for additionalItems: only the items resolver is called. This is because those keywords need to be resolved together, as they affect each other.
Those two resolvers are expected to return an object containing the resolved values of all the associated keywords. The keys must be the names of the keywords. So the properties resolver needs to return an object like this, containing the resolved values for each keyword:
```js
{
  properties: ...,
  patternProperties: ...,
  additionalProperties: ...,
}
```
Also, the resolver function is not passed mergeSchemas, but an object mergers that contains a merger for each of the related keywords. So the properties resolver gets passed an object like this:
```js
var mergers = {
  properties: function mergeSchemas(schemas, childSchemaName) {...},
  patternProperties: function mergeSchemas(schemas, childSchemaName) {...},
  additionalProperties: function mergeSchemas(schemas) {...},
}
```
Some of the mergers require you to supply a string with the name or index of the subschema you are currently merging. This is to make sure the path passed to child resolvers is correct.
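A rough sketch of a custom properties resolver, given the constraints above (the shape of the incoming values and the exact expectations on the returned object are assumptions based on this description, so treat it as illustrative only):

```js
const mergeAllOf = require('json-schema-merge-allof');

const schema = {
  properties: { name: { type: 'string' } },
  allOf: [{ properties: { name: { minLength: 2 } } }]
};

const merged = mergeAllOf(schema, {
  resolvers: {
    properties: function(values, path, mergers, options) {
      // illustrative only: merge every property name seen in any of the
      // conflicting values, delegating each subschema merge to the supplied merger
      const result = { properties: {}, patternProperties: {}, additionalProperties: true };
      const names = new Set();
      values.forEach(v => Object.keys(v.properties || {}).forEach(n => names.add(n)));

      names.forEach(name => {
        const subschemas = values
          .map(v => (v.properties || {})[name])
          .filter(s => s !== undefined);
        // pass the property name so paths reported to child resolvers stay correct
        result.properties[name] = mergers.properties(subschemas, name);
      });

      return result;
    }
  }
});
```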
You can set a default resolver that catches any unknown keyword. Let's say you want to use the same strategy as the ones for the meta keywords, to use the first value found. You can accomplish that like this:
```js
mergeJsonSchema({
  ...
}, {
  resolvers: {
    defaultResolver: mergeJsonSchema.options.resolvers.title
  }
})
```
Resolvers are called whenever multiple conflicting values are found on the same position in the schemas.
You can override a resolver by supplying it in the options.
All built-in reducers for validation keywords are lossless, meaning that they don't remove or add anything in terms of validation.
For meta keywords like title, description, $id, $schema and default, the strategy is to use the first possible value if there are conflicting ones, so the root schema is prioritized. This process may remove some meta information from your schema, so it's lossy. Override this by providing custom resolvers.
If one of your schemas contains a $ref property, you should first dereference it using a ref resolver like json-schema-ref-parser. Resolving $refs is not the task of this library.
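For example, a sketch using @apidevtools/json-schema-ref-parser (the choice of ref resolver is an assumption; any dereferencing library works):

```js
const $RefParser = require('@apidevtools/json-schema-ref-parser');
const mergeAllOf = require('json-schema-merge-allof');

async function mergeWithRefs(schema) {
  // dereference $refs first, then merge the allOf keywords
  const dereferenced = await $RefParser.dereference(schema);
  return mergeAllOf(dereferenced);
}
```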
There exist some libraries that claim to merge schemas combined with allOf, but they merge schemas using very basic logic, essentially the same as lodash merge. So you risk ending up with a schema that allows more or less than the original schema would allow.
We cannot merge schemas that are a logical impossibility, like:
```js
{
  type: 'object',
  allOf: [{
    type: 'array'
  }]
}
```
The library will then throw an error reporting the values that had no valid intersection. But then again, your original schema wouldn't validate anything either.
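A sketch of catching that error (same export assumption as the other examples):

```js
const mergeAllOf = require('json-schema-merge-allof');

try {
  mergeAllOf({
    type: 'object',
    allOf: [{ type: 'array' }]
  });
} catch (err) {
  // the error reports the values that had no valid intersection
  console.error(err.message);
}
```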
Create tests for new functionality and follow the eslint rules.
To create a release:
1. Check out the master branch and make sure it is in sync with the upstream master branch.
2. Run git tag <*.*.*> to tag a new version, following semver guidelines for bumping the version. For example: git tag 0.7.6
3. Run git push origin <*.*.*> to push the tag.
4. After you push the tag, the circleci workflow will publish to npm.
MIT © Martin Hansen
No vulnerabilities found.
OpenSSF Scorecard findings:
- no binaries found in the repo
- Found 9/21 approved changesets -- score normalized to 4
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- no effort to earn an OpenSSF best practices badge detected
- project is not fuzzed
- license file not detected
- security policy file not detected
- SAST tool is not run on all commits -- score normalized to 0
- 54 existing vulnerabilities detected

Last scanned on 2024-11-18.
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.