@maximai/maxim-js
v5.2.6
Maxim AI JS SDK. Visit https://getmaxim.ai for more info.
Maxim SDK
This is the JS/TS SDK for enabling Maxim observability. Maxim is an enterprise-grade evaluation and observability platform.
How to integrate
Install
npm install @maximai/maxim-js
Initialize Maxim logger
import { Maxim } from "@maximai/maxim-js";

const maxim = new Maxim({ apiKey: "maxim-api-key" });
const logger = await maxim.logger({ id: "log-repository-id" });
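In a real application, the logger is typically created once at startup and reused for every trace. A minimal sketch of that pattern, assuming the API key and log repository id come from environment variables (the variable names here are illustrative):

import { Maxim } from "@maximai/maxim-js";

// Create the logger once per process and reuse it everywhere.
let cachedLogger;

export async function getLogger() {
  if (!cachedLogger) {
    const maxim = new Maxim({ apiKey: process.env.MAXIM_API_KEY });
    cachedLogger = await maxim.logger({ id: process.env.MAXIM_LOG_REPO_ID });
  }
  return cachedLogger;
}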
Start sending traces
import { v4 as uuid } from "uuid";

// Start a trace
logger.trace({ id: "trace-id" });

// Add a span
logger.traceSpan("trace-id", { id: "span-id", name: "Intent detection service" });

// Add an LLM call to this span
const generationId = uuid();
logger.spanGeneration("span-id", {
  id: generationId,
  name: "test-inference",
  model: "gpt-3.5-turbo-16k",
  messages: [
    {
      role: "user",
      content: "Hello, how are you?",
    },
  ],
  modelParameters: {
    temperature: 3,
  },
  provider: "openai",
});

// Make the actual call to the LLM (llmCall is a placeholder for your own call)
const result = llmCall();

// Log back the result
logger.generationResult(generationId, result);

// End the span
logger.spanEnd("span-id");

// End the trace
logger.traceEnd("trace-id");
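Putting these pieces together, a trace around a real OpenAI chat completion could look like the sketch below. It uses logger as created earlier; the openai client usage and the shape of the object passed to generationResult are assumptions, so adapt them to your provider and log repository configuration.

import OpenAI from "openai";
import { v4 as uuid } from "uuid";

const openai = new OpenAI(); // assumes OPENAI_API_KEY is set in the environment

async function answerQuery(query: string) {
  const traceId = uuid();
  const spanId = uuid();
  const generationId = uuid();

  logger.trace({ id: traceId });
  logger.traceSpan(traceId, { id: spanId, name: "Answering service" });

  const messages = [{ role: "user" as const, content: query }];
  logger.spanGeneration(spanId, {
    id: generationId,
    name: "answer-inference",
    model: "gpt-3.5-turbo-16k",
    messages,
    modelParameters: { temperature: 0.7 },
    provider: "openai",
  });

  // The actual LLM call
  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo-16k",
    messages,
  });

  // Passing the raw completion object here is an assumption; use whatever
  // result shape your Maxim setup expects.
  logger.generationResult(generationId, completion);

  logger.spanEnd(spanId);
  logger.traceEnd(traceId);

  return completion.choices[0]?.message?.content;
}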
Integrations with other frameworks
Langchain
Use our Maxim Langchain Tracer to integrate Maxim observability with just 2 lines of code.
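As an illustration of what that integration typically looks like, the sketch below attaches the tracer as a Langchain callback. The import path and constructor signature for MaximLangchainTracer are assumptions here; check the Maxim Langchain Tracer documentation for the exact names.

import { ChatOpenAI } from "@langchain/openai";
// Import path and constructor arguments are assumptions; see the Maxim Langchain Tracer docs.
import { MaximLangchainTracer } from "@maximai/maxim-js/langchain";

// The two Maxim-specific lines: create the tracer from the logger, then pass it as a callback.
const maximTracer = new MaximLangchainTracer(logger);
const model = new ChatOpenAI({ model: "gpt-3.5-turbo-16k" });

const response = await model.invoke("Hello, how are you?", { callbacks: [maximTracer] });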
Version changelog
v5.2.6
- Fix: Fixes incorrect message format for OpenAI structured output params
v5.2.5
- Fix: Fixes incorrect mapping of messages for the old Langchain SDK
v5.2.4
- Fix: config fixes for static classes
v5.2.3
- Improvement: Adds AWS Lambda support to the Maxim SDK.
v5.2.2
- Fix: Fixes a critical bug in the implementation of HTTP POST calls where some payloads were getting truncated.
v5.2.1
- Fix: When ending any entity, the endTimestamp is now captured on the client side; previously this was not the case in some scenarios.
- Fix: Data payloads will always be valid JSON
v5.2.0
- Improvement: Adds exponential backoff retries for API calls to the Maxim server.
v5.1.2
- Improvement: Readme updates.
v5.1.1
- Improvement: Detailed logs in debug mode
v5.1.0
- Adds scaffold to support LangchainTracer for Maxim SDK.
v5.0.3
- Exposes MaximLogger for writing wrappers for different developer SDKs.
v5.0.2
- Adds more type-safety for generation messages
v5.0.1
- Adds support for input/output on traces
v5.0.0
- Adds support for Node 12+
v4.0.2
- Fixed a critical bug related to pushing generation results to the Maxim platform
- Improved error handling for network connectivity issues
- Enhanced performance when logging large volumes of data
v4.0.1
- Adds retrieval updates
- Adds ChatMessage support
v4.0.0 (Breaking changes)
- Adds prompt chain support
- Adds vision model support for prompts
v3.0.7
- Adds separate error reporting method for generations
v3.0.6
- Adds top level methods for easier SDK integration
v3.0.5
- Fixes logs push error
v3.0.4
- Minor bug fixes
v3.0.3
- Updates default base URL
v3.0.2
- Prompt selection algorithm v2
v3.0.1
- Minor bug fixes
v3.0.0
- Moves to new base URL
- Adds all new logging support
v2.1.0
- Adds support for adding dataset entries via SDK.
v2.0.0
- Folders, Tags and advanced filtering support.
- Adds support for customizing the default matching algorithm.
v1.1.0
- Adds realtime sync for prompt deployment.
v1.0.0
- Adds support for deployment variables and custom fields. [Breaking change from earlier versions.]
v0.5.0
- Adds support for new SDK APIs.
v0.4.0
- Adds support for custom fields for Prompts.