Overview

This guide walks you through setting up Langtrace’s Python SDK with the OpenTelemetry (OTEL) Collector to send traces to Langtrace Cloud or a self-hosted setup.

Configurations

OTEL Collector

Refer to the OTEL configuration page for details on running and configuring the OpenTelemetry Collector with a custom configuration file.


Ensure that you configure the Langtrace API token in the OTEL collector configuration file before proceeding.
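For reference, a collector configuration that receives OTLP traces and forwards them to Langtrace might look like the sketch below. This is a minimal illustration only: the otlphttp exporter endpoint and the x-api-key header name are assumptions, so use the exact values from the OTEL configuration page.

collector-config.yaml
# Minimal sketch of a collector config. The Langtrace endpoint and
# header name below are placeholders; see the OTEL configuration page
# for the exact values.
receivers:
  otlp:
    protocols:
      http:
        endpoint: 0.0.0.0:4318

exporters:
  otlphttp:
    endpoint: "https://app.langtrace.ai"        # assumed Langtrace endpoint
    headers:
      "x-api-key": "<YOUR_LANGTRACE_API_KEY>"   # your Langtrace API token

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlphttp]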

Python SDK

To send traces to the OTEL Collector, install the required OpenTelemetry exporter package:

pip install opentelemetry-exporter-otlp-proto-http
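The example below also uses the Langtrace SDK and the OpenAI client; if they are not already installed, add them as well:

pip install langtrace-python-sdk openai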

Implementation

Initialize the Langtrace SDK with a custom remote exporter, using OTLPSpanExporter to send traces to the collector.

langtrace_otel.py
from langtrace_python_sdk import langtrace
from openai import OpenAI
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter


# Point the OTLP/HTTP exporter at the collector's traces endpoint.
# The exporter sends OTLP over HTTP, so only the endpoint is needed.
otlp_endpoint = "http://localhost:4318/v1/traces"
otlp_exporter = OTLPSpanExporter(endpoint=otlp_endpoint)

# Initialize Langtrace with the custom exporter; batch=False disables
# batching so spans are exported as they finish, which suits a
# short-lived script like this one.
langtrace.init(custom_remote_exporter=otlp_exporter, batch=False)


def chat_with_openai():
    client = OpenAI()
    messages = [
        {
            "role": "user",
            "content": "How many states of matter are there?",
        },
    ]
    chat_completion = client.chat.completions.create(
        messages=messages,
        stream=False,
        model="gpt-3.5-turbo",
    )
    print(chat_completion.choices[0].message.content)


chat_with_openai()

Run the script:

python langtrace_otel.py
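If everything is configured correctly, the script prints the model’s response, and the generated spans are exported to the collector at http://localhost:4318, which forwards them to Langtrace.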

Conclusion

By following this guide, you have integrated OpenTelemetry support into your Python project, allowing you to send traces to any OpenTelemetry backend, including Langtrace Cloud or a self-hosted setup.