Botanu SDK for TypeScript
Event-level cost attribution for AI workflows, built on OpenTelemetry.

An event is one business transaction — resolving a support ticket, processing an order, generating a report. Each event may involve multiple runs (LLM calls, retries, sub-workflows) across multiple services. By correlating every run to a stable event_id, Botanu gives you per-event cost attribution and outcome tracking without sampling artifacts.
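To make the event/run distinction concrete, here is a minimal sketch of the aggregation idea in plain TypeScript. The `Run` shape and its field names are illustrative assumptions, not the SDK's actual telemetry schema; the point is that every run carrying the same `eventId` rolls up into one per-event total.

```typescript
// Hypothetical shape of a completed run; field names are illustrative,
// not the SDK's real telemetry schema.
interface Run {
  eventId: string;       // stable business-transaction ID
  service: string;       // which service produced the run
  costMicroUsd: number;  // cost attributed to this run, in micro-dollars
}

// Roll runs up to per-event totals: every retry, sub-workflow, and
// downstream call with the same eventId lands in one bucket.
function costPerEvent(runs: Run[]): Map<string, number> {
  const totals = new Map<string, number>();
  for (const run of runs) {
    totals.set(run.eventId, (totals.get(run.eventId) ?? 0) + run.costMicroUsd);
  }
  return totals;
}

const totals = costPerEvent([
  { eventId: 'evt-001', service: 'router', costMicroUsd: 2000 },
  { eventId: 'evt-001', service: 'llm-worker', costMicroUsd: 31000 },
  { eventId: 'evt-002', service: 'llm-worker', costMicroUsd: 12000 },
]);
console.log(totals.get('evt-001')); // 33000 — both runs attributed to one event
```

Because attribution is keyed on a stable ID rather than derived from sampled traces, retries and fan-out across services do not distort the per-event total.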

Getting Started

npm install @botanu/sdk @opentelemetry/api

One install. Includes OTel SDK, OTLP exporter, and auto-instrumentation for 30+ libraries.

Option 1: Zero-code (recommended for intermediate services)

No code changes. Add a CLI flag and set env vars:

BOTANU_API_KEY=btnu_live_... \
BOTANU_SERVICE_NAME=my-service \
node --require @botanu/sdk/register dist/server.js

Or in your Dockerfile:

ENV BOTANU_API_KEY=btnu_live_...
ENV BOTANU_SERVICE_NAME=my-service
CMD ["node", "--require", "@botanu/sdk/register", "dist/server.js"]

Or in package.json:

{
  "scripts": {
    "start": "node --require @botanu/sdk/register dist/server.js"
  }
}

Option 2: YAML config

Create a botanu.yaml in your project root:

service:
  name: my-service
  environment: production

otlp:
  endpoint: https://ingest.botanu.ai
  headers:
    Authorization: "Bearer ${BOTANU_API_KEY}"

Then use the zero-code register or call enable() in your code — config is loaded automatically.
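The `${BOTANU_API_KEY}` placeholder above suggests environment-variable interpolation in config values. As a rough sketch of that substitution (the SDK's actual loader may behave differently, e.g. around unset variables):

```typescript
// Replace ${VAR_NAME} placeholders in a config string with values from env.
// Unset variables are left as-is rather than becoming the string "undefined".
function interpolateEnv(
  value: string,
  env: Record<string, string | undefined> = process.env,
): string {
  return value.replace(/\$\{([A-Z_][A-Z0-9_]*)\}/g, (match, name) => env[name] ?? match);
}

console.log(interpolateEnv('Bearer ${BOTANU_API_KEY}', { BOTANU_API_KEY: 'btnu_live_xyz' }));
// → "Bearer btnu_live_xyz"
```

Keeping the secret in an environment variable, rather than in `botanu.yaml` itself, means the config file can be committed safely.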

Option 3: Code (for workflow entry points)

import { enable, botanuWorkflow, emitOutcome } from '@botanu/sdk';

enable(); // reads config from env vars or botanu.yaml

const doWork = botanuWorkflow(
  { name: 'my-workflow', eventId: 'evt-001', customerId: 'cust-42' },
  async () => {
    const result = await doSomething();
    emitOutcome('success');
    return result;
  },
);

When to use what

  • Workflow entry point (the service that starts the workflow): Use Option 3 with the botanuWorkflow() decorator
  • Intermediate/downstream services (just passing context through): Use Option 1 or 2 — zero code, context propagates automatically via W3C Baggage
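Between services, W3C Baggage travels as a comma-separated `baggage` HTTP header. To illustrate what downstream services receive, here is a minimal parser for that header format; the `botanu.event_id` and `botanu.customer_id` key names are assumptions for illustration, not documented Botanu baggage keys.

```typescript
// Parse a W3C Baggage header ("key1=val1,key2=val2;prop") into a map.
// Values are percent-decoded; list-member properties after ';' are dropped.
function parseBaggage(header: string): Map<string, string> {
  const entries = new Map<string, string>();
  for (const member of header.split(',')) {
    const [pair] = member.split(';');          // strip list-member properties
    const eq = pair.indexOf('=');
    if (eq === -1) continue;                   // skip malformed members
    const key = pair.slice(0, eq).trim();
    const value = decodeURIComponent(pair.slice(eq + 1).trim());
    if (key) entries.set(key, value);
  }
  return entries;
}

// A header like this arriving at an intermediate service is what lets
// auto-instrumentation correlate its spans to the originating event.
const baggage = parseBaggage('botanu.event_id=evt-001,botanu.customer_id=cust-42');
console.log(baggage.get('botanu.event_id')); // "evt-001"
```

This is why intermediate services need no code changes: the auto-instrumentation reads and re-emits this header on every hop.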

See the Quick Start guide for a full walkthrough.

Documentation

Topic                  Description
Installation           Install and configure the SDK
Quick Start            Get up and running in 5 minutes
Configuration          Environment variables and options
Core Concepts          Events, runs, context propagation, architecture
LLM Tracking           Track model calls and token usage
Data Tracking          Database, storage, and messaging
Outcomes               Record business outcomes for ROI
Auto-Instrumentation   Supported libraries and frameworks
Kubernetes             Zero-code instrumentation at scale
API Reference          Decorators, tracking API, configuration
Best Practices         Recommended patterns

Requirements

  • Node.js 18+
  • OpenTelemetry Collector (recommended for production)

Contributing

We welcome contributions from the community. Please read our Contributing Guide before submitting a pull request.

This project requires DCO sign-off on all commits:

git commit -s -m "Your commit message"

Looking for a place to start? Check the good first issues.

Community

Governance

See GOVERNANCE.md for details on roles, decision-making, and the contributor ladder.

Current maintainers are listed in MAINTAINERS.md.

Security

To report a security vulnerability, please use GitHub Security Advisories or see SECURITY.md for full details. Do not file a public issue.

Code of Conduct

This project follows the LF Projects Code of Conduct. See CODE_OF_CONDUCT.md.

License

Apache License 2.0
