Installations
npm install loopback-connector-saphana
Score
72.6
Supply Chain
97.4
Quality
77.2
Maintenance
50
Vulnerability
98.9
License
Releases
Unable to fetch releases
Developer
jensonzhao
Developer Guide
Module System
CommonJS
Min. Node Version
Typescript Support
No
Node Version
NPM Version
1.4.28
Statistics
3 Stars
46 Commits
3 Forks
4 Watching
1 Branches
1 Contributors
Updated on 28 Jan 2023
Languages
JavaScript (100%)
Total Downloads
Cumulative downloads
14,085
Last day
-36.4%
21
Compared to previous day
Last week
-36.4%
105
Compared to previous week
Last month
40.5%
614
Compared to previous month
Last year
34.1%
3,471
Compared to previous year
Download charts: daily, weekly, monthly, and yearly downloads.
Dependencies
4
Dev Dependencies
6
loopback-connector-saphana
loopback-connector-saphana is the SAP HANA connector module for loopback-datasource-juggler.
Connector settings
The connector can be configured using the following settings from the data source.
- host (defaults to 'localhost'): The host name or IP address of the SAP HANA server
- port (defaults to 30015): The port number of the SAP HANA server
- username: The user name used to connect to the SAP HANA server
- password: The password
- debug (defaults to false)
NOTE: By default, all tables use the user's default schema, which is the same as the username.
The SAP HANA connector uses node-hdb as the driver. For more information about configuration parameters, see https://github.com/SAP/node-hdb/blob/master/README.md.
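A minimal configuration sketch, assuming the connector is wired into loopback-datasource-juggler the same way as other LoopBack connectors; the host, user name, and password values below are placeholders.

// Placeholder settings -- replace with your own SAP HANA server details.
var DataSource = require('loopback-datasource-juggler').DataSource;

var config = {
  host: 'localhost',   // host name or IP address of the SAP HANA server
  port: 30015,         // port number of the SAP HANA server
  username: 'SYSTEM',  // placeholder user; tables use this user's default schema
  password: 'secret',  // placeholder password
  debug: false
};

var dataSource = new DataSource(require('loopback-connector-saphana'), config);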
Discovering Models
SAP HANA data sources allow you to discover model definition information from existing SAP HANA databases. See the following APIs:
- dataSource.discoverModelDefinitions([username], fn)
- dataSource.discoverSchema([owner], name, fn)
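A usage sketch that follows the signatures listed above, reusing the dataSource from the settings example; 'STRONGLOOP' and 'INVENTORY' are placeholder schema and table names.

// List table/view definitions visible to a placeholder user/schema.
dataSource.discoverModelDefinitions('STRONGLOOP', function (err, tables) {
  if (err) return console.error(err);
  console.log(tables);
});

// Discover the model definition for a single table.
dataSource.discoverSchema('STRONGLOOP', 'INVENTORY', function (err, schema) {
  if (err) return console.error(err);
  console.log(JSON.stringify(schema, null, 2)); // shaped like the example in the next section
});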
Model definition for SAP HANA
The model definition consists of the following properties:
- name: Name of the model; by default, it is the camel case of the table name
- options: Model-level options and mapping to the SAP HANA schema/table
- properties: Property definitions, including mapping to SAP HANA columns
{
  "name": "Inventory",
  "options": {
    "idInjection": false,
    "hdb": {
      "schema": "strongloop",
      "table": "inventory"
    }
  },
  "properties": {
    "id": {
      "type": "String",
      "required": false,
      "length": 64,
      "precision": null,
      "scale": null,
      "hdb": {
        "columnName": "id",
        "dataType": "varchar",
        "dataLength": 64,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "productId": {
      "type": "String",
      "required": false,
      "length": 20,
      "precision": null,
      "scale": null,
      "id": 1,
      "hdb": {
        "columnName": "product_id",
        "dataType": "varchar",
        "dataLength": 20,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "YES"
      }
    },
    "locationId": {
      "type": "String",
      "required": false,
      "length": 20,
      "precision": null,
      "scale": null,
      "id": 1,
      "hdb": {
        "columnName": "location_id",
        "dataType": "varchar",
        "dataLength": 20,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "YES"
      }
    },
    "available": {
      "type": "Number",
      "required": false,
      "length": null,
      "precision": 32,
      "scale": 0,
      "hdb": {
        "columnName": "available",
        "dataType": "integer",
        "dataLength": null,
        "dataPrecision": 32,
        "dataScale": 0,
        "nullable": "YES"
      }
    },
    "total": {
      "type": "Number",
      "required": false,
      "length": null,
      "precision": 32,
      "scale": 0,
      "hdb": {
        "columnName": "total",
        "dataType": "integer",
        "dataLength": null,
        "dataPrecision": 32,
        "dataScale": 0,
        "nullable": "YES"
      }
    }
  }
}
Type Mapping
- Number
- Boolean
- String
- Object
- Date
- Array
JSON to SAP HANA Types
- String|JSON|Text|default: VARCHAR, default length is 1024
- Number: INTEGER
- Date: TIMESTAMP
- Timestamp: TIMESTAMP
- Boolean: VARCHAR(1)
SAP HANA Types to JSON
- VARCHAR(1): Boolean
- VARCHAR|NVARCHAR|ALPHANUM|SHORTTEXT: String
- VARBINARY: Binary
- TINYINT|SMALLINT|INTEGER|BIGINT|SMALLDECIMAL|DECIMAL|REAL|DOUBLE: Number
- DATE|TIME|SECONDDATE|TIMESTAMP: Date
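For example, under the mappings above, a model defined through loopback-datasource-juggler ends up with the column types noted in the comments; the 'Note' model and its properties below are purely illustrative.

// Hypothetical model used only to illustrate the JSON-to-SAP-HANA type mapping.
var Note = dataSource.define('Note', {
  title: {type: String, length: 255},  // VARCHAR(255) -- explicit length
  content: {type: String},             // VARCHAR(1024) -- default length
  rating: {type: Number},              // INTEGER
  createdAt: {type: Date},             // TIMESTAMP
  published: {type: Boolean}           // VARCHAR(1)
});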
Destroying Models
Destroying models may result in errors due to foreign key integrity. Make sure to delete any related models first before calling delete on models with relationships.
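For instance, with hypothetical Customer and Order models where Order carries a customerId foreign key, remove the dependent orders before destroying the customer:

// Hypothetical models; delete dependent rows first to avoid foreign key errors.
Order.destroyAll({customerId: customer.id}, function (err) {
  if (err) return console.error(err);
  Customer.destroyById(customer.id, function (err) {
    if (err) console.error(err);
  });
});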
Auto Migrate / Auto Update
After making changes to your model properties you must call Model.automigrate() or Model.autoupdate(). Only call Model.automigrate() on new models, as it will drop existing tables.
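The same operations can also be triggered at the data source level; a sketch, reusing the dataSource from the settings example with a placeholder 'Inventory' model name:

// Drops and recreates the tables for the listed models -- existing data is lost.
dataSource.automigrate(['Inventory'], function (err) {
  if (err) console.error(err);
});

// Alters existing tables in place to match the current model definitions.
dataSource.autoupdate(['Inventory'], function (err) {
  if (err) console.error(err);
});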
The LoopBack SAP HANA connector creates the following schema objects for a given model:
- A table, for example, "product" under the default schema of the user
- A sequence with name "product_seq" if the primary key "id" is auto-increment
Running tests
npm test
- Prerequisites for saphana.discover.test.js: execute tables.sql in SAP HANA Studio
No vulnerabilities found.
Reason
no binaries found in the repo
Reason
0 existing vulnerabilities detected
Reason
license file detected
Details
- Info: project has a license file: LICENSE:0
- Info: FSF or OSI recognized license: Apache License 2.0: LICENSE:0
Reason
Found 1/24 approved changesets -- score normalized to 0
Reason
project is archived
Details
- Warn: Repository is archived.
Reason
no effort to earn an OpenSSF best practices badge detected
Reason
security policy file not detected
Details
- Warn: no security policy file detected
- Warn: no security file to analyze
- Warn: no security file to analyze
- Warn: no security file to analyze
Reason
project is not fuzzed
Details
- Warn: no fuzzer integrations found
Reason
branch protection not enabled on development/release branches
Details
- Warn: branch protection not enabled for branch 'master'
Reason
SAST tool is not run on all commits -- score normalized to 0
Details
- Warn: 0 commits out of 7 are checked with a SAST tool
Score
3
/10
Last Scanned on 2024-11-18
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.
Learn More