Gathering detailed insights and metrics for pino-elasticsearch
npm install pino-elasticsearch
- 180 Stars · 237 Commits · 67 Forks · 8 Watching · 5 Branches · 44 Contributors
- Updated on 24 Oct 2024
- Languages: JavaScript (80.66%), Shell (16.76%), TypeScript (2.57%)
Downloads:

| Period | Downloads | Change vs previous period |
|---|---|---|
| Last day | 3,609 | -9.6% |
| Last week | 18,169 | -2.7% |
| Last month | 76,777 | +9.6% |
| Last year | 831,391 | +54.8% |
Load pino logs into Elasticsearch.
npm install pino-elasticsearch -g
pino-elasticsearch
To send pino logs to elasticsearch:
cat log | pino-elasticsearch --node http://localhost:9200
Flags

```
-h | --help                Display Help
-v | --version             display Version
-n | --node                the URL where Elasticsearch is running
-i | --index               the name of the index to use; default: pino
                           will replace %{DATE} with the YYYY-MM-DD date
-t | --type                the name of the type to use; default: log
-f | --flush-bytes         the number of bytes for each bulk insert; default: 1000
-t | --flush-interval      time that the helper will wait before flushing; default: 30000
-l | --trace-level         trace level for the elasticsearch client, default 'error' (info, debug, trace)
   | --es-version          specify the major version number of Elasticsearch (eg: 5, 6, 7)
                           (this is needed only if you are using Elasticsearch <= 7)
-u | --username            Username to specify with authentication method
                           (can only be used in tandem with the 'password' flag)
-p | --password            Password to specify with authentication method
                           (can only be used in tandem with the 'username' flag)
-k | --api-key             Api key for authentication instead of username/password combination
-c | --cloud               Id of the elastic cloud node to connect to
-r | --read-config         the name of config file
   | --rejectUnauthorized  Reject any connection which is not authorized with the list of supplied CAs; default: true
   | --opType              the op_type to use (create, index)
                           "create" is required when using datastreams
```
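As a sketch of how the `%{DATE}` placeholder in `--index` behaves (the substitution logic below is an assumption based on the flag description, not the library's actual code):

```javascript
// Sketch (assumption): --index 'pino-%{DATE}' substitutes the log line's
// YYYY-MM-DD date into the index name.
function resolveIndex(pattern, isoTime) {
  // isoTime like '2024-10-24T12:00:00.000Z'; take the YYYY-MM-DD prefix
  return pattern.replace('%{DATE}', isoTime.slice(0, 10))
}

console.log(resolveIndex('pino-%{DATE}', '2024-10-24T12:00:00.000Z'))
// → pino-2024-10-24
```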
```js
const pino = require('pino')
const pinoElastic = require('pino-elasticsearch')

const streamToElastic = pinoElastic({
  index: 'an-index',
  node: 'http://localhost:9200',
  esVersion: 7,
  flushBytes: 1000
})

const logger = pino({ level: 'info' }, streamToElastic)

logger.info('hello world')
// ...
```
If you want to output to multiple streams (transports and console), you can use either `pino.transport` or `pino.multistream`. Transport is recommended and can be used like:
```js
const pino = require('pino');

const pinoOptions = {};

const logger = pino(pinoOptions, pino.transport({
  targets: [
    {
      target: 'pino/file', // This will log to stdout
    }, {
      target: 'pino-elasticsearch',
      options: {
        index: 'an-index',
        node: 'http://localhost:9200',
        esVersion: 7,
        flushBytes: 1000
      }
    }
  ]
}));
```
Please note: the `options` must be serializable. For example, if you use a function as `index`, you will get an error. See pino-pretty for an example of how to use it.
If you want to use a custom Connection class for the Elasticsearch client, you can pass it as an option when using as a module. See the Elasticsearch client docs for Connection.
```js
const pino = require('pino')
const pinoElastic = require('pino-elasticsearch')

const Connection = <custom Connection>

const streamToElastic = pinoElastic({
  index: 'an-index',
  node: 'http://localhost:9200',
  esVersion: 7,
  flushBytes: 1000,
  Connection
})

const logger = pino({ level: 'info' }, streamToElastic)

logger.info('hello world')
// ...
```
Assuming your Elasticsearch instance is running and accessible, there are a couple of common problems that will cause indices or events (log entries) to not be created in Elasticsearch when using this library. Developers can get feedback on these problems by listening for events returned by the stream handler.
The stream handler returned from the default factory function is an `EventEmitter`. Developers can use this interface to debug issues encountered by the `pino-elasticsearch` library.
```js
const pinoElastic = require('pino-elasticsearch');

const streamToElastic = pinoElastic({
  index: 'an-index',
  node: 'http://localhost:9200',
  esVersion: 7,
  flushBytes: 1000
})

streamToElastic.on('<event>', (error) => console.log(error));
```
The following table lists the events emitted by the stream handler:
| Event | Callback Signature | Description |
|---|---|---|
| `unknown` | `(line: string, error: string) => void` | Event received by `pino-elasticsearch` is unparseable (via `JSON.parse`) |
| `insertError` | `(error: Error & { document: Record<string, any> }) => void` | The bulk insert request to Elasticsearch failed (records dropped) |
| `insert` | `(stats: Record<string, any>) => void` | Called when an insert was successfully performed |
| `error` | `(error: Error) => void` | Called when the Elasticsearch client fails for some other reason |
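For intuition, the `unknown` event corresponds to a line that fails `JSON.parse`. The check below is a hypothetical illustration, not the library's internal code:

```javascript
// Sketch (assumption): a line that fails JSON.parse would be reported via the
// 'unknown' event instead of being indexed.
function classifyLine(line) {
  try {
    return { ok: true, doc: JSON.parse(line) }
  } catch (err) {
    return { ok: false, line, error: String(err) }
  }
}

console.log(classifyLine('{"level":30,"msg":"hi"}').ok) // → true
console.log(classifyLine('plain text, not JSON').ok)    // → false
```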
There are a few common problems developers will encounter when initially using this library. The next section discusses these issues.
Pino output is not JSON
Any transform of the stream (like `pino-pretty`) that results in non-JSON output will be ignored (the `unknown` event will be emitted).
```js
const pino = require('pino');
const pinoElastic = require('pino-elasticsearch');

const streamToElastic = pinoElastic({
  index: 'an-index',
  node: 'http://localhost:9200',
  esVersion: 7,
  flushBytes: 1000
})

streamToElastic.on(
  'unknown',
  (line, error) =>
    console.log('Expect to see a lot of these with Pino Pretty turned on.')
);

const logger = pino({
  prettyPrint: true,
}, streamToElastic)
```
Events do not match index schema/mappings
Elasticsearch schema mappings are strict: events that do not match the mapped format are dropped. A typical example is using the default `pino` format with the `logs-` index in Elasticsearch. The default installation of Elasticsearch includes a data pipeline mapped to the `logs-` index prefix. This pipeline is intended for the Beats and Logstash aggregators and requires `@timestamp` and `@message` fields, while the default `pino` setup uses `time` and `msg`. Attempting to write events to the `logs-` index without mapping/transforming the `pino` schema will result in events being dropped.
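A hypothetical transform (the function name and exact field handling are illustrative, not part of this library) makes the mismatch concrete:

```javascript
// Sketch: rename default pino fields (time/msg) to the @timestamp/@message
// fields expected by the logs- pipeline. Illustration only.
function toLogsPipelineShape(record) {
  const { time, msg, ...rest } = record
  return {
    '@timestamp': new Date(time).toISOString(),
    '@message': msg,
    ...rest
  }
}

const out = toLogsPipelineShape({ time: 1700000000000, msg: 'hello', level: 30 })
// out now carries @timestamp and @message instead of time and msg
```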
Developers can troubleshoot insert errors by watching for the `insertError` event.
```js
streamToElastic.on(
  'insertError',
  (error) => {
    const documentThatFailed = error.document;
    console.log(`An error occurred inserting document:`, documentThatFailed);
  }
);
```
If you want to use the Elastic Common Schema, you should install `@elastic/ecs-pino-format`, as the `ecs` option of this module has been removed.
```js
const pino = require('pino')
const ecsFormat = require('@elastic/ecs-pino-format')()
const pinoElastic = require('pino-elasticsearch')

const streamToElastic = pinoElastic({
  index: 'an-index',
  node: 'http://localhost:9200',
  esVersion: 7,
  flushBytes: 1000
})

const logger = pino({ level: 'info', ...ecsFormat }, streamToElastic)

logger.info('hello world')
// ...
```
You can then use Kibana to browse and visualize your logs.
Note: This transport works only with Elasticsearch version ≥ 5.
It is possible to customize the index name for every log line by providing a function to the `index` option:
```js
const pino = require('pino')
const pinoElastic = require('pino-elasticsearch')

const streamToElastic = pinoElastic({
  index: function (logTime) {
    // logTime is an ISO 8601 formatted timestamp of the log line
    return `awesome-app-${logTime.substring(5, 10)}`
  },
  node: 'http://localhost:9200'
})
// ...
```
The function must be synchronous, must not throw, and must return a string.
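One way to satisfy those constraints is a wrapper that always falls back to a fixed name on bad input (`indexByMonth` and its fallback index are illustrative, not part of the API):

```javascript
// Sketch: an index function that is synchronous, never throws, and always
// returns a string, falling back to a fixed index on bad input.
function indexByMonth(logTime) {
  try {
    return `awesome-app-${logTime.substring(0, 7)}` // YYYY-MM
  } catch {
    return 'awesome-app-unknown'
  }
}

console.log(indexByMonth('2024-10-24T12:00:00.000Z')) // → awesome-app-2024-10
```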
Indexing to datastreams requires the `opType` to be set to `create`:
```js
const pino = require('pino')
const pinoElastic = require('pino-elasticsearch')

const streamToElastic = pinoElastic({
  index: "type-dataset-namespace",
  node: 'http://localhost:9200',
  opType: 'create'
})
// ...
```
```js
const pino = require('pino')
const ecsFormat = require('@elastic/ecs-pino-format')
const pinoElastic = require('pino-elasticsearch')

const streamToElastic = pinoElastic({
  index: 'an-index',
  node: 'http://localhost:9200',
  esVersion: 7,
  flushBytes: 1000
})

// Capture errors like being unable to connect to the Elasticsearch instance.
streamToElastic.on('error', (error) => {
  console.error('Elasticsearch client error:', error);
})
// Capture errors returned from Elasticsearch; called every time a document can't be indexed.
streamToElastic.on('insertError', (error) => {
  console.error('Elasticsearch server error:', error);
})

const logger = pino({ level: 'info', ...ecsFormat() }, streamToElastic)

logger.info('hello world')
```
If you need to use basic authentication to connect with the Elasticsearch cluster, pass the credentials in the URL:
cat log | pino-elasticsearch --node https://user:pwd@localhost:9200
Alternatively you can supply a combination of `username` and `password`, OR an `api-key`:
cat log | pino-elasticsearch --node https://localhost:9200 -u user -p pwd
cat log | pino-elasticsearch --node https://localhost:9200 --api-key=base64EncodedKey
The Elastic cloud option `cloud` is also supported:
cat log | pino-elasticsearch --cloud=name:bG9jYWxob3N0JGFiY2QkZWZnaA== --api-key=base64EncodedKey
Note: when using the CLI, if you pass username/password AND an apiKey, the apiKey will take precedence over the username/password combination.
You can also include the `auth` field in your configuration like so:
```js
const pinoElastic = require('pino-elasticsearch')

const streamToElastic = pinoElastic({
  index: 'an-index',
  node: 'http://localhost:9200',
  auth: {
    username: 'user',
    password: 'pwd'
  },
  esVersion: 7,
  flushBytes: 1000
})
```
Alternatively you can pass an `apiKey` instead:
```js
const pinoElastic = require('pino-elasticsearch')

const streamToElastic = pinoElastic({
  index: 'an-index',
  node: 'http://localhost:9200',
  cloud: {
    id: 'name:bG9jYWxob3N0JGFiY2QkZWZnaA=='
  },
  auth: {
    apiKey: 'apikey123'
  },
  esVersion: 7,
  flushBytes: 1000
})
```
For a full list of authentication options when using Elastic, check out the authentication configuration docs.
Setting up pino-elasticsearch is easy, and you can use the bundled docker-compose.yml file to bring up both Elasticsearch and Kibana. You will need docker and docker-compose; then, in this project folder, launch `docker-compose -f docker-compose-v8.yml up`.
You can test it by launching `node example | pino-elasticsearch` in this project folder. You will need to have `pino-elasticsearch` installed globally.
This project was kindly sponsored by nearForm.
Licensed under MIT.
No vulnerabilities found.
OpenSSF Scorecard results:

- no binaries found in the repo
- no dangerous workflow patterns detected
- 0 existing vulnerabilities detected
- license file detected
- detected GitHub workflow tokens with excessive permissions
- Found 12/21 approved changesets -- score normalized to 5
- dependency not pinned by hash detected -- score normalized to 2
- 2 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 1
- no effort to earn an OpenSSF best practices badge detected
- project is not fuzzed
- branch protection not enabled on development/release branches
- security policy file not detected
- SAST tool is not run on all commits -- score normalized to 0
Last Scanned on 2024-11-18
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.