Overview

Elastic APM is a powerful solution for monitoring the performance of your applications. This guide focuses on integrating Langtrace AI with Elastic APM for distributed tracing. By leveraging Elastic’s capabilities, you can analyze and visualize trace data from Langtrace AI.

Prerequisites

Before you begin, ensure you have the following:

  • An Elastic deployment with APM enabled (Elastic Cloud or self-managed).
  • Python 3 and pip installed.
  • A Python application you want to trace with Langtrace AI.

Setup

Generate Elastic APM API Token

To generate the Elastic APM API token:

  • Log in to your Elastic deployment, then click Observability.
  • Navigate to the Integrations tab, then select APM.

  • Select OpenTelemetry, then set the following environment variables in your terminal or .env file:
export OTEL_SERVICE_NAME=your-elastic-service
export OTEL_RESOURCE_ATTRIBUTES=service.name=your-elastic-service
export OTEL_EXPORTER_OTLP_ENDPOINT="https://your-elastic-apm-endpoint:443/v1/traces"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer your-elastic-apm-token"

Replace the following values:

  • your-elastic-apm-token with your actual Elastic APM API token.
  • your-elastic-apm-endpoint with your APM server endpoint.
  • your-elastic-service with your application/service name.

Note: The OTEL_EXPORTER_OTLP_HEADERS should include the full header string, including the “Authorization=Bearer” prefix.

Install the Instrumentation Library

If you haven’t already, install the latest releases of the OpenTelemetry OTLP HTTP exporter and the Langtrace Python SDK:

pip install -U opentelemetry-exporter-otlp-proto-http langtrace_python_sdk

Add the custom exporter to your code

Make sure you have set the environment variables as described above.

Add the following code snippet to your Python application:

import os

from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Read the exporter settings from the environment variables set earlier
otlp_endpoint = os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"]
otlp_headers = os.environ.get("OTEL_EXPORTER_OTLP_HEADERS", "")

# Parse "Name=Value" pairs into a dict; maxsplit=1 keeps any "="
# inside the token value intact
headers = dict(item.split("=", 1) for item in otlp_headers.split(",")) if otlp_headers else {}

# Set up the Elastic exporter
elastic_exporter = OTLPSpanExporter(
    endpoint=otlp_endpoint,
    headers=headers,
)
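For the header string used earlier in this guide, the parsing step produces a plain dict. Here is a standalone sketch (the token value is the placeholder from above, not a real secret):

```python
# Standalone sketch of how the OTLP headers string becomes a dict.
raw = "Authorization=Bearer your-elastic-apm-token"

# maxsplit=1 so that any "=" inside the token value is preserved
headers = dict(item.split("=", 1) for item in raw.split(","))

print(headers)  # {'Authorization': 'Bearer your-elastic-apm-token'}
```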

Initialize the Langtrace SDK

Initialize the Langtrace SDK in your application:

from langtrace_python_sdk import langtrace
# Initialize Langtrace with Elastic exporter
langtrace.init(
    custom_remote_exporter=elastic_exporter,
    batch=True,
)

Run the Application

With the environment variables set, run your application with OpenTelemetry instrumentation. If the opentelemetry-instrument command is not available, it is provided by the opentelemetry-instrumentation package (installed as part of opentelemetry-distro). Use the following command:

opentelemetry-instrument python main.py
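If you prefer not to rely on exported variables, the same settings can be passed inline for a one-off run (the values below are the placeholders from this guide):

```shell
# One-off run with the settings inline; replace the placeholders as above.
OTEL_SERVICE_NAME=your-elastic-service \
OTEL_EXPORTER_OTLP_ENDPOINT="https://your-elastic-apm-endpoint:443/v1/traces" \
OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer your-elastic-apm-token" \
opentelemetry-instrument python main.py
```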

Verifying the Setup

Once the application is running, you should see the output of your query printed to the console. You should also see traces in your Elastic APM dashboard corresponding to the operations performed in your script. This integration is made possible by Langtrace’s support for OpenTelemetry, which enables seamless tracing and observability.

An Example Application

Here is an example Python application that uses Langtrace AI and Elastic APM:

You will need to install the following libraries:

pip install -U opentelemetry-exporter-otlp-proto-http langtrace_python_sdk langchain-openai langchain-community duckduckgo-search python-dotenv

You can find the full example code, along with more examples, in our Langtrace Recipes repository.

Troubleshooting

Traces not visible in Elastic APM:

  • Ensure that all required environment variables are set correctly. Double-check the variable names and values.
  • Verify that OTEL_EXPORTER_OTLP_ENDPOINT is correct; the endpoint must end with /v1/traces.
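As a quick sanity check, you can validate the endpoint string with Python's standard library before starting the app (the URL below is the placeholder from this guide):

```python
from urllib.parse import urlparse

endpoint = "https://your-elastic-apm-endpoint:443/v1/traces"
parsed = urlparse(endpoint)

# The scheme should be https and the path must end with /v1/traces
print(parsed.scheme, parsed.path)  # https /v1/traces
```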

Missing Libraries: Verify that all necessary libraries are installed. If you encounter any errors, reinstall the libraries using the provided commands.
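One way to check for missing libraries programmatically is with the standard library; `missing_modules` below is an illustrative helper, not part of any SDK:

```python
import importlib.util

def missing_modules(names):
    """Return the module names that cannot be found in this environment."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# 'json' ships with Python, so only the made-up name is reported missing.
print(missing_modules(["json", "not_a_real_module_xyz"]))  # ['not_a_real_module_xyz']
```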

Additional Resources