Datadog is a powerful platform for monitoring the performance of your applications. This guide focuses on integrating Langtrace AI with Datadog APM, via an OpenTelemetry (OTLP) collector, for distributed tracing of your LLM-powered applications.
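Before initializing the SDK, install the required packages. The package names below are inferred from the imports used in this guide; verify them against your project:

```bash
# Package names assumed from the imports in this guide
pip install langtrace-python-sdk openai opentelemetry-exporter-otlp
```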
Initialize the Langtrace SDK in your application with your OTLP collector endpoint:
```python
import os

from openai import OpenAI
from langtrace_python_sdk import langtrace
from langtrace_python_sdk.utils.with_root_span import with_langtrace_root_span
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Set up the OTLP exporter with the endpoint from your collector
otlp_exporter = OTLPSpanExporter(
    # This should match your collector's OTLP gRPC endpoint
    endpoint=os.environ.get("OTEL_EXPORTER_OTLP_ENDPOINT"),
    insecure=True,  # Set to False if using HTTPS
)

# Set up Langtrace with the custom exporter
langtrace.init(custom_remote_exporter=otlp_exporter)

# Rest of your code
@with_langtrace_root_span()  # Optional: creates a parent root span for the entire application
def app():
    client = OpenAI(api_key="<YOUR_OPENAI_API_KEY>")
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "system",
                "content": "How many states of matter are there?",
            }
        ],
    )
    print(response.choices[0].message.content)

app()
```
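The exporter above reads its endpoint from the `OTEL_EXPORTER_OTLP_ENDPOINT` environment variable. As a sketch, assuming the collector runs locally and listens on the default OTLP gRPC port (4317), you could set:

```bash
# Assumes a locally running collector on the default OTLP gRPC port
export OTEL_EXPORTER_OTLP_ENDPOINT="localhost:4317"
```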
Note: The OTLP collector in this example is set up using a config file; a minimal sketch of such a file is shown below.
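The sketch assumes the OpenTelemetry Collector contrib distribution (which bundles the `datadog` exporter), an API key supplied via a `DD_API_KEY` environment variable, and the `datadoghq.com` site; substitute your own values:

```yaml
# Minimal sketch of a collector config: receive OTLP traces over gRPC
# and forward them to Datadog via the contrib datadog exporter.
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317

processors:
  batch:

exporters:
  datadog:
    api:
      site: datadoghq.com     # assumption: adjust to your Datadog site
      key: ${env:DD_API_KEY}  # assumption: API key supplied via environment

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [datadog]
```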
Once the application is running, you should see traces in your Datadog APM dashboard.

Missing Libraries: Verify that all necessary libraries are installed. If you encounter any errors, reinstall them using the installation commands above.

Additional Resources