Pricing

100% Open Source. Forever Free.

OpenLIT is Apache 2.0 licensed. Self-host everything with zero licensing fees. A managed cloud option is coming soon for teams who prefer zero-ops.

Available Now

Self-Hosted

Run on your own infrastructure

$0 / forever

Apache 2.0 — no license key, no usage limits

  • Full LLM observability (60+ integrations)
  • OpenTelemetry-native traces & metrics
  • Token usage & cost tracking
  • GPU monitoring (NVIDIA + AMD)
  • Vector DB monitoring
  • Prompt Hub with versioning
  • Secrets Vault
  • Fleet Hub (multi-deployment)
  • LLM Evaluations
  • Custom model pricing
  • Manage organizations
  • Export to Grafana, Datadog, and any OTLP backend
  • Community support (GitHub, Discord)

Coming Soon

Cloud

Fully managed by the OpenLIT team

Join the waitlist to be notified at launch

  • Everything in Self-Hosted (60+ integrations)
  • Managed infrastructure — no ops overhead
  • Automatic upgrades
  • Dedicated support
  • SLA guarantees
  • SSO / SAML
  • Audit logs
  • Priority feature requests

Support the Project

OpenLIT is free forever. If it saves you time or money, consider sponsoring development via OpenCollective.

Token Supporter

$10/month

Show your appreciation and help keep OpenLIT maintained.

Sponsor on OpenCollective
  • Community supporter badge
  • Priority GitHub issue triage
  • Access to sponsor channel on Discord

Context Window Hero

$50/month

Directly fund new features and integrations.

Sponsor on OpenCollective
  • Everything in Token Supporter
  • Your logo in README sponsors section
  • Direct input on roadmap priorities
  • Monthly call with core team

How OpenLIT Compares

See how OpenLIT stacks up against other LLM observability tools.

Frequently Asked Questions

Is OpenLIT really free?

Yes. OpenLIT is Apache 2.0 licensed and free to self-host with no usage limits, no feature gates, and no license key required.

What do I need to self-host?

Docker and Docker Compose. Clone the OpenLIT repo, run `docker compose up -d`, and you'll have the full stack (UI, ClickHouse storage, and OpenTelemetry Collector) running in under two minutes.
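A minimal sketch of that flow, assuming the repository lives at `github.com/openlit/openlit` (check the README for the current clone URL and any required `.env` setup):

```shell
# Clone the repo and start the full stack:
# UI, ClickHouse storage, and OpenTelemetry Collector
git clone https://github.com/openlit/openlit.git
cd openlit
docker compose up -d

# Instrument your application with the SDK
pip install openlit
```

By default the SDK sends telemetry to the locally running collector, so no extra configuration is needed for the self-hosted stack.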

Can I send data to my existing Grafana or Datadog instance?

Yes. OpenLIT is OpenTelemetry-native. Configure the OTLP endpoint to point at any OTLP-compatible backend: Grafana, Datadog, New Relic, SigNoz, Jaeger, and more.
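A configuration sketch of what pointing the SDK at an external backend might look like. The `otlp_endpoint` and `otlp_headers` parameter names and the endpoint value are assumptions here; verify them against the OpenLIT SDK documentation for your version:

```python
import openlit

# Send traces and metrics to an external OTLP-compatible backend
# (Grafana, Datadog, New Relic, SigNoz, Jaeger, ...) instead of the
# bundled stack. Parameter names are assumed -- check the SDK docs.
openlit.init(
    otlp_endpoint="http://my-otel-collector:4318",          # illustrative
    otlp_headers="Authorization=Bearer <token>",            # if auth is required
)
```

Because OpenLIT emits standard OTLP, the standard OpenTelemetry environment variables (such as `OTEL_EXPORTER_OTLP_ENDPOINT`) are an alternative way to configure the destination without touching code.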

Does OpenLIT support GPU monitoring?

Yes. Enable GPU metrics with `openlit.init(collect_gpu_stats=True)`. OpenLIT collects utilization, VRAM usage, temperature, and power draw from NVIDIA and AMD GPUs.
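The one-liner from the answer above, shown in context as a minimal configuration sketch (the `collect_gpu_stats` flag comes from the text; the surrounding setup is an assumed typical usage):

```python
import openlit

# Enable GPU metrics alongside regular LLM telemetry. With this flag set,
# OpenLIT reports utilization, VRAM usage, temperature, and power draw
# for NVIDIA and AMD GPUs on the host.
openlit.init(collect_gpu_stats=True)
```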

When will the Cloud tier be available?

The OpenLIT Cloud managed service is in development. Join the waitlist by emailing [email protected] and you will be notified at launch.

Ready to Transform Your AI Observability?

Join thousands of developers using OpenLIT to build better, more reliable LLM applications. Get started in less than a minute.