Langfuse is an open-source LLM engineering platform designed to debug and improve LLM applications. It integrates with Langchain, OpenAI, LlamaIndex, LiteLLM, and more, offering traces, evals, prompt management, and metrics. It is built with security in mind: SOC 2 Type II and ISO 27001 certified, GDPR compliant, and aligned with HIPAA.
Freemium
How to use Langfuse?
Langfuse can be used to trace, evaluate, and manage prompts in LLM applications. It helps in debugging applications faster, managing prompts collaboratively, testing different prompts and models, collecting user feedback, and tracking cost, latency, and quality. It's suitable for teams building complex LLM apps.
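The workflow described above — instrument each step of an LLM pipeline, record latency and metadata as a trace, then attach feedback scores — can be sketched with a minimal stand-in tracer. This is plain Python for illustration, not the Langfuse SDK; the class and method names are assumptions chosen to mirror the concepts (traces, spans, scores):

```python
import time
import uuid

class Trace:
    """Conceptual stand-in for an LLM trace: a named record of
    nested spans with latency and metadata, plus attached scores."""

    def __init__(self, name):
        self.id = str(uuid.uuid4())
        self.name = name
        self.spans = []   # list of (span_name, latency_seconds, metadata)
        self.scores = {}  # e.g. user feedback or eval results

    def span(self, name, fn, **metadata):
        """Run fn, recording its latency and metadata as a span."""
        start = time.perf_counter()
        result = fn()
        self.spans.append((name, time.perf_counter() - start, metadata))
        return result

    def score(self, name, value):
        """Attach an evaluation score (e.g. user feedback) to the trace."""
        self.scores[name] = value


# Usage: wrap each pipeline step in a span, then attach a
# feedback score so quality can be tracked alongside latency.
trace = Trace("answer-question")
answer = trace.span(
    "generation",
    lambda: "Paris",                   # stand-in for a real LLM call
    model="gpt-4o", prompt_version=3,  # illustrative metadata
)
trace.score("user_feedback", 1)        # e.g. a thumbs-up from the user
```

In a real setup the SDK sends this data to the platform backend, where the UI surfaces it for debugging; the sketch only shows what a trace conceptually captures.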
Langfuse's Core Features
Detailed production traces to debug LLM applications faster
Version and deploy prompts collaboratively with low latency
Test different prompts and models in the Langfuse UI
Collect user feedback and run evaluation functions
Derive datasets from production data for fine-tuning models
Track cost, latency, and quality metrics
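One feature in the list above worth a concrete illustration is deriving datasets from production data for fine-tuning. A common pattern is to filter traces by user feedback and export the survivors as chat-style JSONL; the trace fields and output format below are illustrative assumptions, not the Langfuse export schema:

```python
import json

# Illustrative production traces: in practice these would be
# exported from the tracing platform, not hard-coded.
traces = [
    {"input": "Capital of France?", "output": "Paris", "user_feedback": 1},
    {"input": "2 + 2?", "output": "5", "user_feedback": -1},
    {"input": "Largest ocean?", "output": "Pacific", "user_feedback": 1},
]

# Keep only positively rated examples and reshape them into the
# chat-message JSONL format commonly used for fine-tuning.
dataset = [
    {"messages": [
        {"role": "user", "content": t["input"]},
        {"role": "assistant", "content": t["output"]},
    ]}
    for t in traces if t["user_feedback"] > 0
]

with open("finetune.jsonl", "w") as f:
    for row in dataset:
        f.write(json.dumps(row) + "\n")
```

Filtering on feedback before export is the point: production traffic gives realistic inputs, and the score keeps low-quality completions out of the training set.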
Langfuse's Use Cases
Developers can debug LLM applications faster with detailed production traces.
Teams can collaboratively version and deploy prompts with low latency.
Product managers can test different prompts and models directly in the UI.
Data scientists can collect user feedback and run evaluations to improve models.
Startups can leverage the Hobby plan for POCs without any cost.
Langfuse's Pricing
Hobby
Free
Get started, no credit card required. Great for hobby projects and POCs.
Core
$59/month
For production projects. Includes more history and usage, plus unlimited users.
Pro
$199/month
For scaling projects. Unlimited history, high rate limits, all features.