
Integration Overview

Integrate your application with Langfuse to explore production traces and metrics.

Objectives:

  1. Capture traces of your application
  2. Add scores to these traces to measure/evaluate quality of outputs

There are currently seven main ways to integrate with Langfuse:

Main Integrations

| Integration | Supports | Description |
| --- | --- | --- |
| SDK | Python, JS/TS | Manual instrumentation using the SDKs for full flexibility. |
| OpenAI | Python, JS/TS | Automated instrumentation using a drop-in replacement of the OpenAI SDK. |
| Langchain | Python, JS/TS | Automated instrumentation by passing a callback handler to your Langchain application. |
| LlamaIndex | Python | Automated instrumentation via the LlamaIndex callback system. |
| Haystack | Python | Automated instrumentation via the Haystack content tracing system. |
| LiteLLM | Python, JS/TS (proxy only) | Use any LLM as a drop-in replacement for GPT. Use Azure, OpenAI, Cohere, Anthropic, Ollama, VLLM, Sagemaker, HuggingFace, Replicate (100+ LLMs). |
| API | | Directly call the public API. OpenAPI spec available. |

Packages integrated with Langfuse

| Name | Description |
| --- | --- |
| Instructor | Library to get structured LLM outputs (JSON, Pydantic). |
| Mirascope | Python toolkit for building LLM applications. |
| AI SDK by Vercel | TypeScript SDK that makes streaming LLM outputs easy. |
| Flowise | JS/TS no-code builder for customized LLM flows. |
| Langflow | Python-based UI for LangChain, designed with react-flow to provide an effortless way to experiment and prototype flows. |
| Superagent | Open-source AI assistant framework & API for prototyping and deploying agents. |

Unsure which integration to choose? Ask us on Discord or in the chat.
