Elastic APM
Langtrace and Elastic APM Integration Guide
Overview
Elastic APM is a powerful solution for monitoring the performance of your applications. As Elastic APM is OpenTelemetry (OTel) compatible by default, you can easily integrate it with Langtrace to monitor your LLM applications.
Environment Variables
Set up the following environment variables to enable OpenTelemetry tracing and send traces to Elastic APM:
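For example, here is a minimal sketch that sets the variables from Python before Langtrace initializes; the endpoint URL, secret token, and service name are placeholders for your own Elastic APM deployment:

```python
import os

# Placeholder values: replace with your Elastic APM server URL and secret token.
# Note that the endpoint should end with /v1/traces (see Troubleshooting below).
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://your-apm-server.example.com:443/v1/traces"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "Authorization=Bearer <your-elastic-apm-secret-token>"
os.environ["OTEL_RESOURCE_ATTRIBUTES"] = "service.name=my-rag-chatbot"
```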
We support all standard OpenTelemetry environment variables. Alternatively, you can use the legacy Langtrace format:
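A sketch of the legacy form, assuming LANGTRACE_API_HOST points directly at your APM server's OTLP traces endpoint and authentication is still supplied through the standard OTLP headers variable; the URL and token below are placeholders:

```python
import os

# Legacy Langtrace variable: point the API host at the APM traces endpoint.
os.environ["LANGTRACE_API_HOST"] = "https://your-apm-server.example.com:443/v1/traces"
# Assumption: authentication is still passed via the standard OTLP headers variable.
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "Authorization=Bearer <your-elastic-apm-secret-token>"
```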
RAG Application Example
Here’s an example of monitoring a Retrieval-Augmented Generation (RAG) chatbot application that uses Elasticsearch for document search and Azure OpenAI for generating responses.
Initialize Langtrace
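A minimal sketch using the Langtrace Python SDK; with the environment variables above already set, no arguments should be needed, though depending on your SDK version you may also pass options such as an API key or host directly to init:

```python
from langtrace_python_sdk import langtrace

# Initialize Langtrace before importing or constructing any LLM clients,
# so that their calls are instrumented and exported to Elastic APM.
langtrace.init()
```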
Configure Azure OpenAI (if using)
Set up the necessary Azure OpenAI environment variables:
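For example, with placeholder values (the openai Python client's AzureOpenAI class reads these variables by default):

```python
import os

# Placeholder values: replace with your Azure OpenAI resource details.
os.environ["AZURE_OPENAI_API_KEY"] = "<your-azure-openai-api-key>"
os.environ["AZURE_OPENAI_ENDPOINT"] = "https://<your-resource>.openai.azure.com/"
os.environ["OPENAI_API_VERSION"] = "2024-02-01"  # use an API version your deployment supports
```

With Langtrace initialized and Azure OpenAI configured, a minimal version of the RAG flow might look like the sketch below. The Elasticsearch URL, index name, and deployment name are illustrative placeholders rather than part of the original guide:

```python
from elasticsearch import Elasticsearch
from openai import AzureOpenAI

# Hypothetical connection and index names, for illustration only.
es = Elasticsearch("http://localhost:9200")
llm = AzureOpenAI()  # picks up the Azure OpenAI environment variables set above

def answer(question: str) -> str:
    # Retrieve candidate documents from Elasticsearch with a simple match query.
    hits = es.search(
        index="docs",
        query={"match": {"content": question}},
        size=3,
    )["hits"]["hits"]
    context = "\n\n".join(hit["_source"]["content"] for hit in hits)

    # Generate an answer grounded in the retrieved context.
    response = llm.chat.completions.create(
        model="<your-deployment-name>",  # Azure OpenAI deployment name
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content
```

Because Langtrace is initialized before the Azure OpenAI client is created, the LLM call in answer() should be captured as a span and exported to Elastic APM.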
Monitoring in Elastic APM
After setting up your application, you can monitor your RAG application’s performance in the Elastic APM dashboard. The interface provides deep insights into your application’s behavior:
The main dashboard displays:
- Service health metrics
- Transaction throughput
- Error rates and patterns
- Infrastructure metrics
Detailed trace analysis reveals the complete request journey:
The trace view enables you to:
- Track request flows
- Measure operation durations
- Identify performance bottlenecks
- Monitor error propagation
For detailed span inspection:
The span analysis shows:
- Precise timing measurements
- Request/response payloads
- Error context and details
- Custom metadata and annotations
Use these traces for:
- Debugging issues
- Performance monitoring
- System optimization
- Understanding user interactions
Troubleshooting
• Traces not visible in Elastic APM:
- Ensure that all required environment variables are set correctly
- Verify your OTEL_EXPORTER_OTLP_ENDPOINT or LANGTRACE_API_HOST includes /v1/traces at the end
- Check if your authentication token is valid and properly formatted
- Verify your service name is correctly set in OTEL_RESOURCE_ATTRIBUTES
• Missing spans or incomplete traces:
- Ensure Langtrace is initialized before any LLM operations
- Check if all required dependencies are installed
- Verify your application has proper permissions to send traces