Learn how to configure the OpenTelemetry Collector integration.
The following is a quick guide to running an OpenTelemetry Collector with a custom configuration that forwards traces to Langtrace, either self-hosted or on Langtrace Cloud.
Save the following to a file named otel.yaml and replace <LANGTRACE_API_KEY> with your Langtrace API key.
If you are using a self-hosted setup, you will need to log in as an admin user first to create a project and generate an API key.
The Langtrace client processes only JSON-encoded data, so make sure to use an exporter that encodes payloads as JSON and does not apply any compression.
Set the exporter endpoint to https://SELF_HOSTED_URL/api/trace if you are sending traces to a self-hosted Langtrace backend. For example, for localhost the endpoint would be http://localhost:3000/api/trace. If you are running the collector alongside Langtrace's docker compose setup, the endpoint would be http://langtrace:3000/api/trace, i.e. the name of the Langtrace client container specified in the docker compose file. If you wish to debug how the collector is working, you can add the debug exporter to the config and append it to the traces pipeline.
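A minimal otel.yaml along these lines might look as follows. This is a sketch, not the exact config from the original guide: the receiver settings, the x-api-key header name, and the use of the otlphttp exporter are assumptions, chosen because otlphttp supports encoding: json and compression: none, which satisfies the JSON/no-compression requirement above.

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

exporters:
  otlphttp:
    # Use https://SELF_HOSTED_URL/api/trace for a self-hosted backend,
    # e.g. http://localhost:3000/api/trace or http://langtrace:3000/api/trace
    # when the collector runs inside Langtrace's docker compose setup.
    endpoint: "http://localhost:3000/api/trace"
    headers:
      # Header name is an assumption; replace the placeholder with your key.
      "x-api-key": "<LANGTRACE_API_KEY>"
    # Langtrace only accepts uncompressed JSON payloads.
    encoding: json
    compression: none
  # Optional: uncomment to inspect what the collector is doing.
  # debug: {}

service:
  pipelines:
    traces:
      receivers: [otlp]
      # Append debug here if you enabled it above, e.g. [otlphttp, debug]
      exporters: [otlphttp]
```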
The following is a sample docker-compose.yaml file that runs the OpenTelemetry Collector with a custom configuration file, mounted as a volume in the container.
Use this as a reference to run the OpenTelemetry Collector with a custom configuration file.
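A compose file along these lines could look like the sketch below. The image tag, mount path, and service name are assumptions; adjust them to your setup, and keep otel.yaml next to the compose file so the relative mount resolves.

```yaml
services:
  otel-collector:
    # contrib image is an assumption; the core image also works for this config
    image: otel/opentelemetry-collector-contrib:latest
    command: ["--config=/etc/otel.yaml"]
    volumes:
      # Mounts the custom configuration file into the container
      - ./otel.yaml:/etc/otel.yaml
    ports:
      - "4317:4317"   # OTLP gRPC receiver
      - "4318:4318"   # OTLP HTTP receiver
```

If the collector should reach a Langtrace container at http://langtrace:3000/api/trace, both services need to share a docker network, e.g. by adding this service to Langtrace's own compose file.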