
🚅 LiteLLM Integration

LiteLLM (GitHub): Use any LLM as a drop-in replacement for GPT. Use Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, HuggingFace, Replicate (100+ LLMs).

You can find more in-depth documentation in the LiteLLM docs.

There are three ways to integrate LiteLLM with Langfuse:

  1. LiteLLM Proxy with OpenAI SDK Wrapper: the proxy standardizes 100+ models on the OpenAI API schema, and the Langfuse OpenAI SDK wrapper instruments the LLM calls.
  2. LiteLLM Python SDK, which can send logs to Langfuse if the Langfuse environment variables are set.
  3. LiteLLM Proxy, which can send logs to Langfuse if enabled in the proxy UI.

Integration

The LiteLLM proxy is a simple way to integrate LiteLLM with Langfuse. You can register Langfuse as the success and failure callback to log all responses.


To see a full end-to-end example, check out the LiteLLM cookbook.
