
openlit

v1.3.0

Published

OpenTelemetry-native Auto instrumentation library for monitoring LLM Applications, facilitating the integration of observability into your GenAI-driven projects

Downloads

356

Readme

Documentation | Quickstart | Typescript SDK


The OpenLIT TypeScript SDK is an OpenTelemetry-native auto-instrumentation library for monitoring LLM applications, making it easy to integrate observability into your GenAI-driven projects. Designed for simplicity and efficiency, OpenLIT lets you embed observability into your projects with just a single line of code.

Whether you're using LLM libraries like OpenAI or Anthropic directly, OpenLIT integrates observability into your applications seamlessly, helping you keep performance and reliability visible across diverse scenarios.
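For illustration, here is a minimal sketch of instrumenting an OpenAI chat completion. It assumes the official openai package is installed and OPENAI_API_KEY is set, and the model name is a placeholder; once Openlit.init() has run, the SDK's auto instrumentation is expected to pick up the subsequent client calls.

import Openlit from 'openlit';
import OpenAI from 'openai';

// Initialize OpenLIT before the LLM client is used so its calls are instrumented.
Openlit.init();

// The OpenAI client reads OPENAI_API_KEY from the environment.
const openai = new OpenAI();

async function main() {
  // This call is expected to be traced automatically; the model name is a placeholder.
  const completion = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: 'Say hello to observability.' }],
  });
  console.log(completion.choices[0].message.content);
}

main();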

This project adheres to the Semantic Conventions proposed by the OpenTelemetry community. You can check out the current definitions here.

Auto Instrumentation Capabilities

| LLMs |
| ------------ |
| ✅ OpenAI |
| ✅ ChromaDB |
| ✅ LiteLLM |
| ✅ Anthropic |

Supported Destinations

💿 Installation

npm install openlit

🚀 Getting Started

Step 1: Install OpenLIT

Open your command line or terminal and run:

npm install openlit

Step 2: Initialize OpenLIT in your Application

Integrating OpenLIT into LLM applications is straightforward. Start monitoring your LLM application with just two lines of code:

import Openlit from 'openlit';

Openlit.init();

To forward telemetry data to an HTTP OTLP endpoint, such as the OpenTelemetry Collector, set the otlpEndpoint parameter to the desired endpoint. Alternatively, you can configure the endpoint by setting the OTEL_EXPORTER_OTLP_ENDPOINT environment variable as recommended in the OpenTelemetry documentation.

💡 Info: If you don't provide the otlpEndpoint function argument or set the OTEL_EXPORTER_OTLP_ENDPOINT environment variable, OpenLIT writes traces directly to your console, which can be useful during development.

To send telemetry to OpenTelemetry backends that require authentication, set the otlpHeaders parameter to the required headers. Alternatively, you can configure the headers by setting the OTEL_EXPORTER_OTLP_HEADERS environment variable as recommended in the OpenTelemetry documentation.

Example



Option 1: Initialize using function arguments. Add the following two lines to your application code:

import Openlit from 'openlit';

Openlit.init({ 
  otlpEndpoint: 'YOUR_OTEL_ENDPOINT',
  otlpHeaders: 'YOUR_OTEL_ENDPOINT_AUTH'
});


Option 2: Initialize using environment variables. Add the following two lines to your application code:

import Openlit from "openlit"

Openlit.init()

Then, configure your OTLP endpoint and headers using environment variables:

export OTEL_EXPORTER_OTLP_ENDPOINT="YOUR_OTEL_ENDPOINT"
export OTEL_EXPORTER_OTLP_HEADERS="YOUR_OTEL_ENDPOINT_AUTH"

Step 3: Visualize and Optimize!

With LLM observability data now being collected and sent to OpenLIT, the next step is to visualize and analyze this data to gain insight into your LLM application's performance and behavior and to identify areas for improvement.

To begin exploring your LLM application's performance data within OpenLIT, please see the Quickstart Guide.

If you want to integrate and send metrics and traces to your existing observability tools, refer to our Connections Guide for detailed instructions.

Configuration

Below is a detailed overview of the configuration options available, allowing you to adjust OpenLIT's behavior and functionality to align with your specific observability needs:

| Argument | Description | Default Value | Required |
| ------------------------ | -------------------------------------------------------------------------------------- | ------------------------------------------------------------------ | -------- |
| environment | The deployment environment of the application. | "default" | No |
| applicationName | Identifies the name of your application. | "default" | No |
| tracer | An instance of OpenTelemetry Tracer for tracing operations. | undefined | No |
| otlpEndpoint | Specifies the OTLP endpoint for transmitting telemetry data. | undefined | No |
| otlpHeaders | Defines headers for the OTLP exporter, useful for backends requiring authentication. | undefined | No |
| disableBatch | A flag to disable batch span processing, favoring immediate dispatch. | true | No |
| traceContent | Enables tracing of content for deeper insights. | true | No |
| disabledInstrumentations | List of instrumentations to disable. | undefined | No |
| instrumentations | Object of instrumentation modules for manual patching. | undefined | No |
| pricing_json | URL or file path of the pricing JSON file. | https://github.com/openlit/openlit/blob/main/assets/pricing.json | No |
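For illustration, here is a hedged sketch combining several of these options in a single Openlit.init() call; the endpoint, application name, and instrumentation identifier below are placeholders rather than values taken from the library:

import Openlit from 'openlit';

Openlit.init({
  environment: 'production',               // deployment environment label
  applicationName: 'chat-service',         // hypothetical application name
  otlpEndpoint: 'http://localhost:4318',   // assumed local OpenTelemetry Collector HTTP endpoint
  traceContent: false,                     // opt out of capturing prompt/completion content
  disabledInstrumentations: ['chroma'],    // hypothetical instrumentation identifier
});

Options you omit keep the defaults listed in the table above.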

🌱 Contributing

Whether it's big or small, we love contributions 💚. Check out our Contribution guide to get started.

Unsure where to start? Here are a few ways to get involved:

  • Join our Slack or Discord community to discuss ideas, share feedback, and connect with both our team and the wider OpenLIT community.

Your input helps us grow and improve, and we're here to support you every step of the way.

💚 Community & Support

Connect with the OpenLIT community and maintainers for support, discussions, and updates:

  • 🌟 If you like OpenLIT, leave a star on our GitHub.
  • 🌍 Join our Slack or Discord community for live interactions and questions.
  • 🐞 Report bugs on our GitHub Issues to help us improve OpenLIT.
  • 𝕏 Follow us on X for the latest updates and news.