
@langtrase/typescript-sdk v6.3.0

A TypeScript SDK for Langtrace

Downloads: 1,053

Langtrace

Open Source & OpenTelemetry (OTEL) Observability for LLM applications



Langtrace is open-source observability software that lets you capture, debug, and analyze traces and metrics from all of your applications that leverage LLM APIs, vector databases, and LLM-based frameworks.

OpenTelemetry Support

The traces generated by Langtrace adhere to the OpenTelemetry (OTEL) standard. We are developing semantic conventions for the traces generated by this project; you can check out the current definitions in this repository. Note: this is an ongoing effort, and we encourage you to get involved and welcome your feedback.


Langtrace Cloud ☁️

To use the managed SaaS version of Langtrace, follow the steps below:

  1. Sign up by going to this link.
  2. Create a new Project after signing up. Projects are containers for storing the traces and metrics generated by your application. If you have only one application, a single project will do.
  3. Generate an API key from inside the project.
  4. In your application, install the Langtrace SDK and initialize it with the API key you generated in step 3.
  5. The code for installing and setting up the SDK is shown below.

Getting Started

Get started by adding just three lines to your code!

npm i @langtrase/typescript-sdk

import * as Langtrace from '@langtrase/typescript-sdk'; // Must precede any LLM module imports
Langtrace.init({ api_key: '<your_api_key>' });

OR

import * as Langtrace from '@langtrase/typescript-sdk'; // Must precede any LLM module imports
Langtrace.init(); // Reads LANGTRACE_API_KEY from the environment

Langtrace Self Hosted

Get started by adding just two lines to your code and see traces logged to the console!

npm i @langtrase/typescript-sdk

import * as Langtrace from '@langtrase/typescript-sdk'; // Must precede any LLM module imports
Langtrace.init({
  write_spans_to_console: true,
  api_host: '<HOSTED_URL>/api/trace',
});

Langtrace Self Hosted Custom Exporter

Get started by adding just three lines to your code and see traces exported to your remote location!

npm i @langtrase/typescript-sdk

import * as Langtrace from '@langtrase/typescript-sdk'; // Must precede any LLM module imports
Langtrace.init({ custom_remote_exporter: <your_exporter>, batch: <true_or_false> });
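
For illustration, here is a minimal sketch of an exporter you might pass as custom_remote_exporter, assuming Langtrace accepts any OpenTelemetry SpanExporter (the collector URL and payload shape are hypothetical):

import { SpanExporter, ReadableSpan } from '@opentelemetry/sdk-trace-base';
import { ExportResult, ExportResultCode } from '@opentelemetry/core';
import * as Langtrace from '@langtrase/typescript-sdk';

// Hypothetical exporter that POSTs finished spans to your own collector.
class MyRemoteExporter implements SpanExporter {
  export(spans: ReadableSpan[], resultCallback: (result: ExportResult) => void): void {
    const payload = spans.map((span) => ({
      name: span.name,
      traceId: span.spanContext().traceId,
      spanId: span.spanContext().spanId,
      attributes: span.attributes,
    }));
    fetch('https://collector.example.com/v1/spans', { // hypothetical endpoint
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload),
    })
      .then(() => resultCallback({ code: ExportResultCode.SUCCESS }))
      .catch((error) => resultCallback({ code: ExportResultCode.FAILED, error }));
  }

  shutdown(): Promise<void> {
    return Promise.resolve();
  }
}

Langtrace.init({ custom_remote_exporter: new MyRemoteExporter(), batch: true });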

Error Reporting to Langtrace

By default, all SDK errors are reported to Langtrace via Sentry. This can be disabled by setting the environment variable LANGTRACE_ERROR_REPORTING=False.

Additional Customization

  • withLangTraceRootSpan - This function organizes related spans hierarchically. When you're performing multiple operations that you want to monitor together as a unit, it establishes a "parent" span (named LangtraceRootSpan, or whatever is passed as name), and any calls to the LLM APIs made within the given function (fn) are recorded as "children" of that parent span. This is especially useful for tracking the performance or behavior of a group of operations collectively rather than individually. See the usage sketch after the signature below.
/**
 * @param fn The function to be executed within the context of the root span. It receives the spanId and traceId as arguments.
 * @param name Name of the root span
 * @param spanKind The kind of span to be created
 * @returns The result of the function
 */
export async function withLangTraceRootSpan<T>(
  fn: (spanId: string, traceId: string) => Promise<T>,
  name = 'LangtraceRootSpan',
  spanKind: SpanKind = SpanKind.INTERNAL
): Promise<T>;
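A minimal usage sketch, assuming withLangTraceRootSpan is exported from the SDK root and OpenAI is the traced vendor (the model name and prompts are illustrative):

import * as Langtrace from '@langtrase/typescript-sdk';
import OpenAI from 'openai';

Langtrace.init({ api_key: '<your_api_key>' });
const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

export async function haikuPipeline(): Promise<string | null> {
  // Both completions below are recorded as children of one 'haiku-pipeline' root span.
  return await Langtrace.withLangTraceRootSpan(async () => {
    const draft = await openai.chat.completions.create({
      model: 'gpt-4o-mini', // illustrative model
      messages: [{ role: 'user', content: 'Write a haiku about tracing.' }],
    });
    const critique = await openai.chat.completions.create({
      model: 'gpt-4o-mini',
      messages: [
        { role: 'user', content: `Critique this haiku: ${draft.choices[0].message.content}` },
      ],
    });
    return critique.choices[0].message.content;
  }, 'haiku-pipeline');
}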
  • withAdditionalAttributes - This function enhances traces by adding custom attributes to the current context. These attributes provide extra details about the operations being performed, making it easier to analyze and understand their behavior. See the usage sketch after the signature below.
/**
 * @param fn Function to be executed within the context with the custom attributes added to the current context
 * @param attributes Custom attributes to be added to the current context.
 * Attributes can also be an awaited Promise<Record<string, any>>, e.g.
 * withAdditionalAttributes(async () => doSomething(), await getAdditionalAttributes()),
 * assuming a getAdditionalAttributes function is defined in your code.
 * @returns The result of the function
 */
export async function withAdditionalAttributes<T>(
  fn: () => Promise<T>,
  attributes: Record<string, any> | Promise<Record<string, any>>
): Promise<T>;
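A brief sketch of how this might look in practice (the attribute keys and the traced call are illustrative; it assumes Langtrace.init has already run, as in Getting Started):

import * as Langtrace from '@langtrase/typescript-sdk';
import OpenAI from 'openai';

const openai = new OpenAI();

export async function summarize(): Promise<string | null> {
  // Every span created inside the callback carries the extra attributes.
  return await Langtrace.withAdditionalAttributes(
    async () => {
      const completion = await openai.chat.completions.create({
        model: 'gpt-4o-mini', // illustrative model
        messages: [{ role: 'user', content: 'Summarize OpenTelemetry in one line.' }],
      });
      return completion.choices[0].message.content;
    },
    { 'user.tier': 'pro', 'request.id': 'req_123' } // hypothetical attribute keys
  );
}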
  • getPromptFromRegistry - Fetches either the latest prompt from the prompt registry or a specific version passed in through options. See the usage sketch after the signature below.
/**
 * Fetches a prompt from the registry.
 *
 * @param promptRegistryId - The ID of the prompt registry.
 * @param options - Configuration options for fetching the prompt:
 *    - `prompt_version` - Fetches the prompt with the specified version. If not provided, the live prompt will be fetched. If there is no live prompt, an error will be thrown.
 *    - `variables` - Replaces the variables in the prompt with the provided values. Each key of the object should be the variable name, and the corresponding value should be the value to replace.
 * @returns LangtracePrompt - The fetched prompt with variables replaced as specified.
 */
export const getPromptFromRegistry = async (
  promptRegistryId: string,
  options?: { prompt_version?: number, variables?: Record<string, string> }
): Promise<LangtracePrompt>
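A short sketch of fetching a registry prompt (the registry ID is a placeholder, and the version and variable names are illustrative; it assumes Langtrace.init has already run):

import * as Langtrace from '@langtrase/typescript-sdk';

export async function loadPrompt() {
  // Fetch version 2 of the prompt and substitute the provided variables.
  const prompt = await Langtrace.getPromptFromRegistry('<prompt_registry_id>', {
    prompt_version: 2, // omit to fetch the live prompt
    variables: { customer_name: 'Ada' }, // illustrative variable name/value
  });
  return prompt;
}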
  • sendUserFeedback - Submits feedback on a user's LLM interaction. This function must be used in tandem with withLangTraceRootSpan. See the usage sketch after the signature below.
/**
 * @param userId Id of the user giving feedback
 * @param userScore Score of the feedback
 * @param traceId traceId of the LLM interaction; available when the interaction is wrapped in withLangTraceRootSpan
 * @param spanId spanId of the LLM interaction; available when the interaction is wrapped in withLangTraceRootSpan
 */
export const sendUserFeedback = async ({ userId, userScore, traceId, spanId }: EvaluationAPIData): Promise<void>
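Putting the two together, a hedged sketch (the user id and score are illustrative, and the score's scale is not specified here, so treat 1 as a stand-in for a positive rating; it assumes Langtrace.init has already run):

import * as Langtrace from '@langtrase/typescript-sdk';
import OpenAI from 'openai';

const openai = new OpenAI();

export async function answerAndCollectFeedback(): Promise<void> {
  await Langtrace.withLangTraceRootSpan(async (spanId, traceId) => {
    const completion = await openai.chat.completions.create({
      model: 'gpt-4o-mini', // illustrative model
      messages: [{ role: 'user', content: 'Explain spans vs. traces.' }],
    });
    // ...show the completion to the user and collect a rating...
    await Langtrace.sendUserFeedback({
      userId: 'user_42', // hypothetical user id
      userScore: 1,      // illustrative positive score
      traceId,           // from the enclosing root span
      spanId,
    });
  });
}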

Supported integrations

Langtrace automatically captures traces from the following vendors:

| Vendor       | Type            | TypeScript SDK     | Python SDK                      |
| ------------ | --------------- | ------------------ | ------------------------------- |
| OpenAI       | LLM             | :white_check_mark: | :white_check_mark:              |
| Anthropic    | LLM             | :white_check_mark: | :white_check_mark:              |
| Azure OpenAI | LLM             | :white_check_mark: | :white_check_mark:              |
| Cohere       | LLM             | :white_check_mark: | :white_check_mark:              |
| Groq         | LLM             | :x:                | :white_check_mark:              |
| Perplexity   | LLM             | :white_check_mark: | :white_check_mark:              |
| Gemini       | LLM             | :white_check_mark: | :white_check_mark:              |
| Mistral      | LLM             | :white_check_mark: | :white_check_mark:              |
| xAI          | LLM             | :white_check_mark: | :white_check_mark:              |
| Langchain    | Framework       | :x:                | :white_check_mark:              |
| LlamaIndex   | Framework       | :white_check_mark: | :white_check_mark:              |
| Langgraph    | Framework       | :x:                | :white_check_mark:              |
| AWS Bedrock  | Framework       | :white_check_mark: | :x:                             |
| DSPy         | Framework       | :x:                | :white_check_mark:              |
| CrewAI       | Framework       | :x:                | :white_check_mark:              |
| Ollama       | Framework       | :white_check_mark: | :white_check_mark:              |
| VertexAI     | Framework       | :white_check_mark: | :white_check_mark:              |
| VercelAI     | Framework       | :white_check_mark: | :x:                             |
| Pinecone     | Vector Database | :white_check_mark: | :white_check_mark:              |
| ChromaDB     | Vector Database | :white_check_mark: | :white_check_mark:              |
| QDrant       | Vector Database | :white_check_mark: | :white_check_mark:              |
| Weaviate     | Vector Database | :white_check_mark: | :white_check_mark:              |
| PGVector     | Vector Database | :white_check_mark: | :white_check_mark: (SQLAlchemy) |

Feature Requests and Issues


Contributions

We welcome contributions to this project. To get started, fork this repository and start developing. To get involved, join our Discord workspace.


Security

To report security vulnerabilities, email us at [email protected]. You can read more on security here.


License

  • The Langtrace application is licensed under the AGPL 3.0 license. You can read about this license here.
  • The Langtrace SDKs are licensed under the Apache 2.0 license. You can read about this license here.