lmnr-ai/lmnr

Laminar

Laminar is an open-source observability platform purpose-built for AI agents.

  • Tracing. Docs
    • Powerful, OpenTelemetry-native tracing SDK: one line of code to automatically trace Vercel AI SDK, Browser Use, Stagehand, LangChain, OpenAI, Anthropic, Gemini, and more.
  • Evals. Docs
    • Unopinionated, extensible SDK and CLI for running evals locally or in a CI/CD pipeline.
    • UI for visualizing evals and comparing results.
  • AI monitoring. Docs
    • Define events with natural language descriptions to track issues, logical errors, and custom behavior of your agent.
  • SQL access to all data. Docs
    • Query traces, metrics, and events with a built-in SQL editor. Bulk create datasets from queries. Available via API.
  • Dashboards. Docs
    • Powerful dashboard builder for traces, metrics, and events, with support for custom SQL queries.
  • Data annotation & Datasets. Docs
    • Custom data rendering UI for fast data annotation and dataset creation for evals.
  • Extremely high performance.
    • Written in Rust 🦀
    • Custom real-time engine for viewing traces as they happen.
    • Ultra-fast full-text search over span data.
    • gRPC exporter for tracing data.
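As an illustration of the SQL access, a query in the built-in editor might look like the sketch below. The table and column names here are illustrative assumptions, not the documented schema; see the SQL docs for the actual tables available in your project.

```sql
-- Hypothetical example: find the ten slowest spans from the last day.
-- Table and column names are assumptions for illustration only.
SELECT name, start_time, end_time
FROM spans
WHERE start_time > now() - INTERVAL 1 DAY
ORDER BY end_time - start_time DESC
LIMIT 10;
```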

Documentation

Check out the full documentation at docs.laminar.sh.

Getting started

The fastest and easiest way to get started is with our managed platform: laminar.sh.

Self-hosting with Docker compose

Laminar is easy to self-host locally. For a quick start, clone the repo and start the services with docker compose:

git clone https://github.com/lmnr-ai/lmnr
cd lmnr
docker compose up -d

This spins up a lightweight but full-featured version of the stack, good for a quickstart or light usage. You can access the UI at http://localhost:5667 in your browser.

You will also need to configure the SDK with the correct baseUrl and ports. See the guide on self-hosting.
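For example, in the Python SDK this typically means pointing the initializer at your self-hosted instance. This is a sketch: the port values below are illustrative assumptions, so check the self-hosting guide for the exact values for your deployment.

```python
from lmnr import Laminar

# Point the SDK at a self-hosted Laminar instance instead of the managed
# platform. The host and port values below are illustrative; see the
# self-hosting guide for the correct ones for your docker compose setup.
Laminar.initialize(
    project_api_key="<LMNR_PROJECT_API_KEY>",
    base_url="http://localhost",  # assumed self-hosted host
    http_port=8000,               # assumed HTTP port
    grpc_port=8001,               # assumed gRPC port
)
```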

For a production environment, we recommend using our managed platform, or running docker compose -f docker-compose-full.yml up -d.

Contributing

To run and build Laminar locally, or to learn more about the docker compose files, follow the guide in Contributing.

TS quickstart

First, create a project and generate a project API key. Then,

npm add @lmnr-ai/lmnr

This installs the Laminar TS SDK and all instrumentation packages (OpenAI, Anthropic, LangChain, and more).

To start tracing LLM calls, just add:

import { Laminar } from '@lmnr-ai/lmnr';
Laminar.initialize({ projectApiKey: process.env.LMNR_PROJECT_API_KEY });

To trace inputs and outputs of functions, use the observe wrapper.

import { OpenAI } from 'openai';
import { observe } from '@lmnr-ai/lmnr';

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const poemWriter = observe({ name: 'poemWriter' }, async (topic: string) => {
  const response = await client.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: `write a poem about ${topic}` }],
  });
  return response.choices[0].message.content;
});

await poemWriter('laminar flow');

Python quickstart

First, create a project and generate a project API key. Then,

pip install --upgrade 'lmnr[all]'

This installs the Laminar Python SDK and all instrumentation packages. See the list of all available instruments in the docs.

To start tracing LLM calls, just add:

from lmnr import Laminar
Laminar.initialize(project_api_key="<LMNR_PROJECT_API_KEY>")

To trace inputs and outputs of functions, use the @observe() decorator.

import os
from openai import OpenAI

from lmnr import observe, Laminar
Laminar.initialize(project_api_key="<LMNR_PROJECT_API_KEY>")

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

@observe()  # annotate all functions you want to trace
def poem_writer(topic):
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "user", "content": f"write a poem about {topic}"},
        ],
    )
    poem = response.choices[0].message.content
    return poem

if __name__ == "__main__":
    print(poem_writer(topic="laminar flow"))

Client libraries

To learn more about instrumenting your code, check out our client libraries: @lmnr-ai/lmnr on npm and lmnr on PyPI.