Gathering detailed insights and metrics for @traceloop/instrumentation-azure
Sister project to OpenLLMetry, but in TypeScript. Open-source observability for your LLM application, based on OpenTelemetry.
npm install @traceloop/instrumentation-azure
268 Stars · 28 Forks · 15 Contributors · 51 Branches · 300 Commits · 1 Watcher · Updated on 28 Nov 2024
Languages: TypeScript (97.84%), JavaScript (2.14%), Shell (0.02%)
Total Downloads

Last day: 131 (down 53.7% vs. the previous day)
Last week: 1,666 (down 22.2% vs. the previous week)
Last month: 10,114 (down 31.3% vs. the previous month)
Last year: 67,544 (unchanged vs. the previous year)
Open-source observability for your LLM application
🎉 New: Our semantic conventions are now part of OpenTelemetry! Join the discussion and help us shape the future of LLM observability.
OpenLLMetry-JS is a set of extensions built on top of OpenTelemetry that gives you complete observability over your LLM application. Because it uses OpenTelemetry under the hood, it can be connected to your existing observability solutions - Datadog, Honeycomb, and others.
It's built and maintained by Traceloop under the Apache 2.0 license.
The repo contains standard OpenTelemetry instrumentations for LLM providers and vector DBs, as well as a Traceloop SDK that makes it easy to get started with OpenLLMetry-JS while still emitting standard OpenTelemetry data that plugs into your observability stack. If you already have OpenTelemetry instrumented, you can add any of our instrumentations directly.
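For example, a standalone instrumentation can be registered with an existing OpenTelemetry Node SDK setup. This is a minimal sketch, assuming the package exports a class named AzureOpenAIInstrumentation; verify the exact export name against the package docs.

// Sketch: plugging one OpenLLMetry instrumentation into an existing
// OpenTelemetry setup. `AzureOpenAIInstrumentation` is an assumed
// export name; check the package docs before relying on it.
import { NodeSDK } from "@opentelemetry/sdk-node";
import { AzureOpenAIInstrumentation } from "@traceloop/instrumentation-azure";

const sdk = new NodeSDK({
  // Spans from Azure OpenAI calls are produced alongside any other
  // instrumentations you already register here.
  instrumentations: [new AzureOpenAIInstrumentation()],
});

sdk.start();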
The easiest way to get started is to use our SDK. For a complete guide, go to our docs.
Install the SDK:
npm install --save @traceloop/node-server-sdk
Then, to start instrumenting your code, just add these two lines:
import * as traceloop from "@traceloop/node-server-sdk";

traceloop.initialize();
Make sure to import the SDK before importing any LLM module.
That's it. You're now tracing your code with OpenLLMetry-JS! If you're running this locally, you may want to disable batch sending, so you can see the traces immediately:
traceloop.initialize({ disableBatch: true });
Now, you need to decide where to export the traces. Since the output is standard OpenTelemetry data, any OTLP-compatible backend works; see our docs for instructions on connecting to each supported destination.
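As a rough sketch, the destination can typically be set when initializing the SDK. The baseUrl and apiKey option names below are assumptions drawn from common SDK patterns, not confirmed API; check the docs for the exact fields.

// Sketch: pointing the SDK at an OTLP-compatible backend.
// `baseUrl` and `apiKey` are assumed option names; verify in the docs.
import * as traceloop from "@traceloop/node-server-sdk";

traceloop.initialize({
  baseUrl: "https://api.traceloop.com", // hypothetical endpoint
  apiKey: process.env.TRACELOOP_API_KEY,
});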
OpenLLMetry-JS can instrument everything that OpenTelemetry already instruments - so things like your DB, API calls, and more. On top of that, we built a set of custom extensions that instrument things like your calls to OpenAI or Anthropic, or your Vector DB like Pinecone, Chroma, or Weaviate.
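To illustrate, here is a sketch of an end-to-end traced call, assuming the openai npm package as the LLM client; the model name and prompt are placeholders. Note that the SDK is imported and initialized before the LLM client is used, per the note above.

// Sketch: automatic tracing of an OpenAI call after initialization.
// The SDK import comes before the LLM module, as the docs advise.
import * as traceloop from "@traceloop/node-server-sdk";
import OpenAI from "openai";

traceloop.initialize({ disableBatch: true });

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function main() {
  // This call is captured as an OpenTelemetry span with LLM-specific
  // attributes; no manual span management is needed.
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini", // placeholder model name
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(completion.choices[0].message.content);
}

main();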
Whether it's big or small, we love contributions ❤️ If you're not sure where to get started, check out our guide.
No security vulnerabilities found.