
Tracing for Langchain (Python & JS/TS)

Langfuse Tracing integrates with Langchain via Langchain Callbacks (Python, JS). With the callback handler registered, the Langfuse SDK automatically creates a nested trace for every run of your Langchain application.

Add Langfuse to your Langchain application

You can configure the integration via (1) constructor arguments or (2) environment variables. Get your Langfuse credentials from the Langfuse dashboard.

pip install langfuse

# Initialize Langfuse handler
from langfuse.callback import CallbackHandler

langfuse_handler = CallbackHandler(
    secret_key="sk-lf-...",
    public_key="pk-lf-...",
    host="https://cloud.langfuse.com",  # 🇪🇺 EU region
    # host="https://us.cloud.langfuse.com",  # 🇺🇸 US region
)
 
# Your Langchain code
 
# Add Langfuse handler as callback (classic and LCEL)
chain.invoke({"input": "<user_input>"}, config={"callbacks": [langfuse_handler]})

Also works for run and predict methods.

chain.run(input="<user_input>", callbacks=[langfuse_handler])
conversation.predict(input="<user_input>", callbacks=[langfuse_handler])

Done. Now you can explore detailed traces and metrics in the Langfuse dashboard.
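If you prefer configuration via environment variables (option 2 above), a minimal sketch: the variable names LANGFUSE_SECRET_KEY, LANGFUSE_PUBLIC_KEY, and LANGFUSE_HOST are the ones the SDK reads, and the key values below are placeholders.

```python
import os

# In production these would come from your deployment environment,
# not from code; shown inline here only for illustration.
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."  # placeholder
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."  # placeholder
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # 🇪🇺 EU region

# With the variables set, the handler needs no constructor arguments:
# from langfuse.callback import CallbackHandler
# langfuse_handler = CallbackHandler()
```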

Prefer end-to-end examples? See the Langchain cookbook referenced below.

Additional Configuration

Optional constructor arguments

When initializing the Langfuse handler, you can pass the following optional arguments to use more advanced features.

| Python | JS/TS | Type | Description |
| --- | --- | --- | --- |
| user_id | userId | string | The current user. |
| session_id | sessionId | string | The current session. |
| release | release | string | The release of your application. See experimentation docs for details. |
| version | version | string | The version of your application. See experimentation docs for details. |
| trace_name | | string | Customize the name of the created traces. Defaults to the name of the chain. |
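For instance, the Python arguments above can be passed directly to the handler constructor. This is a sketch with illustrative placeholder values; if the langfuse package is not installed, a tiny stub keeps the snippet runnable for demonstration purposes.

```python
# Sketch: passing the optional arguments from the table above.
try:
    from langfuse.callback import CallbackHandler
except ImportError:
    # Stub so the sketch runs without the langfuse package installed.
    class CallbackHandler:
        def __init__(self, **kwargs):
            self.kwargs = kwargs

langfuse_handler = CallbackHandler(
    user_id="user-1234",        # illustrative placeholder
    session_id="session-1234",  # illustrative placeholder
    release="v2.1.0",           # illustrative placeholder
    version="prompt-v3",        # illustrative placeholder
    trace_name="qa-chain",      # illustrative placeholder
)
```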

Supported Langchain interfaces

The integration covers the following Langchain interfaces: LCEL, invoke(), run(), call(), predict(), async execution, batch() (partial support), and streaming. Coverage varies slightly between the Python and JS/TS SDKs.

We are interested in your feedback! Raise an issue on GitHub to request support for additional interfaces.

Interoperability with Langfuse SDKs

Use the Langchain integration in combination with the regular Langfuse SDKs if you want to:

  • Add non-Langchain related observations to the trace.
  • Group multiple Langchain runs into a single trace.

Learn more about the structure of a trace here.

from langfuse.decorators import langfuse_context, observe
 
# Create a trace via Langfuse decorators and get a Langchain Callback handler for it
@observe()  # automatically log function as a trace to Langfuse
def main():
    # update trace attributes (e.g., name, session_id, user_id)
    langfuse_context.update_current_trace(
        name="custom-trace",
        session_id="session-1234",
        user_id="user-1234",
    )
    # get the langchain handler for the current trace
    langfuse_handler = langfuse_context.get_current_langchain_handler()
 
    # Your Langchain code
 
    # Add Langfuse handler as callback (classic and LCEL)
    chain.invoke({"input": "<user_input>"}, config={"callbacks": [langfuse_handler]})
 
main()

If you pass these callback handlers to your Langchain code, the events will be nested under the respective trace or span in Langfuse.

See the cookbook for an example of this in action (Python).

Limitation: The input/output of the Langchain code will not be added to the trace or span. Adding them would cause unwanted side-effects if they are set manually or if you add multiple Langchain runs. If you want the input/output of the Langchain run on the trace/span, you need to add them yourself via the regular Langfuse SDKs.
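The manual step described above can be sketched as follows, using the decorators SDK's update_current_trace with input/output arguments. This is an illustration, not the authoritative pattern: the chain result is faked, and if langfuse is not installed a small stub keeps the snippet runnable.

```python
# Sketch: manually attaching the Langchain run's input/output to the trace.
try:
    from langfuse.decorators import langfuse_context, observe
except ImportError:
    # Stubs so the sketch runs without the langfuse package installed.
    class _StubContext:
        def update_current_trace(self, **kwargs):
            self.last_update = kwargs
    langfuse_context = _StubContext()

    def observe():
        def decorator(fn):
            return fn
        return decorator

@observe()
def main(user_input: str) -> str:
    # ... run your Langchain chain here; we fake the result:
    result = f"echo: {user_input}"
    # The callback handler will not set input/output on the trace,
    # so attach them yourself:
    langfuse_context.update_current_trace(
        input={"input": user_input},
        output=result,
    )
    return result

main("hello")
```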

Queuing and flushing

The Langfuse SDKs queue and batch events in the background to reduce the number of network requests and improve overall performance. In a long-running application, this works without any additional configuration.
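Conceptually, the queueing and batching behaves like the sketch below: callers enqueue events without blocking on network I/O, and a background worker groups them into batches. This is an illustration of the pattern only, not the SDK's actual implementation.

```python
import queue
import threading

class EventBatcher:
    """Illustrative sketch: events are queued by the caller and
    grouped into batches by a background worker thread."""

    def __init__(self, batch_size: int = 10):
        self._queue: queue.Queue = queue.Queue()
        self._batch_size = batch_size
        self.sent_batches: list[list] = []  # stand-in for network calls
        self._worker = threading.Thread(target=self._run, daemon=True)
        self._worker.start()

    def enqueue(self, event) -> None:
        self._queue.put(event)  # returns immediately; no network I/O

    def _run(self) -> None:
        batch = []
        while True:
            event = self._queue.get()
            if event is None:  # sentinel set by shutdown()
                if batch:
                    self.sent_batches.append(batch)  # flush remainder
                return
            batch.append(event)
            if len(batch) >= self._batch_size:
                self.sent_batches.append(batch)  # "send" a full batch
                batch = []

    def shutdown(self) -> None:
        """Flush remaining events and stop the worker."""
        self._queue.put(None)
        self._worker.join()

batcher = EventBatcher(batch_size=2)
for i in range(5):
    batcher.enqueue({"event": i})
batcher.shutdown()
# 5 events with batch_size=2 are sent as batches of 2, 2, and 1
```

This is why a short-lived script must flush or shut down explicitly, as shown next: without it, events still sitting in the queue are lost when the process exits.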

If you are running a short-lived application, you need to shut down Langfuse to ensure that all events are flushed before the application exits.

langfuse_handler.shutdown_async()

If you want to flush events synchronously without shutting down Langfuse, you can use the flush method.

langfuse_handler.flush_async()
