Quickstart
Start shipping traces to Langtrace Cloud or your preferred OpenTelemetry-compatible backend in under 5 minutes!
Introduction
Langtrace offers flexible options for trace collection and analysis:
- Langtrace Cloud: Our managed SaaS solution for easy setup and immediate insights.
- Self-hosted Langtrace: For organizations that prefer to host Langtrace on their own infrastructure.
- OpenTelemetry Integration: The Langtrace SDK supports sending traces to any OpenTelemetry-compatible backend (Datadog, New Relic, Grafana, etc.), allowing you to use your existing observability stack.
Important: When using Langtrace with third-party OpenTelemetry-compatible vendors, you don't need to generate a Langtrace API key. The SDK can be configured to send traces directly to your preferred backend.
Choose the option that best fits your needs and follow the corresponding setup instructions below.
Langtrace Cloud
To use the managed SaaS version of Langtrace, follow these steps:
- Sign up by going to this link.
- Create a new Project after signing up. Projects are containers for storing traces and metrics generated by your application. If you have only one application, a single project is enough.
- Generate an API key. This key will be used to authenticate your application with Langtrace Cloud.
You may also create new projects and generate API keys for each of them later.
- In your application, install the Langtrace SDK. The code for installing the SDK is shown below:
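A minimal sketch, assuming the Python SDK (the parameter table later in this guide uses Python types); the package name below is the one published for the Python SDK:
```bash
pip install langtrace-python-sdk
```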
- Initialize the Langtrace SDK with the API key you generated in step 3. The code for setting up the SDK is shown below:
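A minimal sketch, assuming the Python SDK; the `init()` parameters mirror the table at the end of this page, and the placeholder key is the one you generated in step 3:
```python
# Sketch: initialize Langtrace before importing the LLM libraries you want traced.
from langtrace_python_sdk import langtrace

langtrace.init(api_key="<your-langtrace-api-key>")  # key generated in step 3
```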
- You can now use Langtrace in your code. Here's a simple example:
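A hedged example, assuming the OpenAI Python client as the instrumented library; any LLM SDK supported by Langtrace is traced the same way, and the model name is purely illustrative:
```python
# Sketch: once langtrace.init() has run, calls made through supported
# LLM clients (OpenAI is assumed here) are traced automatically.
from langtrace_python_sdk import langtrace
from openai import OpenAI

langtrace.init(api_key="<your-langtrace-api-key>")

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Say hello to Langtrace!"}],
)
print(response.choices[0].message.content)
```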
Congrats! You can now view your traces on Langtrace Cloud.
OpenTelemetry Integration
If you prefer to use Langtrace with your existing OpenTelemetry-compatible backend:
- Install the Langtrace SDK as described in the Langtrace Cloud section.
- Instead of initializing with a Langtrace API key, configure the SDK to use your OpenTelemetry exporter. Refer to our OpenTelemetry tools documentation or consult the documentation for your platform of choice; a sketch follows the note below.
Note: When shipping traces directly to other observability tools (e.g., Datadog, Instana, New Relic), you do not need a Langtrace API key. Only the API key for your chosen observability tool is required.
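As an illustration, one way to wire this up with the Python SDK is to pass a standard OTLP exporter through the `custom_remote_exporter` parameter described in the table at the end of this page. The endpoint and header below are placeholders for whatever your backend expects:
```python
# Sketch: export spans to an OpenTelemetry-compatible backend instead of
# Langtrace Cloud. Endpoint and headers are placeholders for your vendor.
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from langtrace_python_sdk import langtrace

exporter = OTLPSpanExporter(
    endpoint="https://otlp.example.com/v1/traces",  # your backend's OTLP/HTTP endpoint
    headers={"api-key": "<your-vendor-api-key>"},   # vendor auth header, if required
)

langtrace.init(custom_remote_exporter=exporter, batch=True)  # no Langtrace API key needed
```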
Langtrace Self-hosted
For users/organizations that want to host Langtrace on their own infrastructure, follow these steps to get started.
Configure Langtrace SDK
For more details on configuring the Langtrace SDK, refer to the Langtrace SDK Features page.
| Parameter | Type | Default Value | Description |
|---|---|---|---|
| `batch` | `bool` | `True` | Whether to batch spans before sending them. |
| `api_key` | `str` | `LANGTRACE_API_KEY` or `None` | The API key for authentication. |
| `write_spans_to_console` | `bool` | `False` | Whether to write spans to the console. |
| `custom_remote_exporter` | `Optional[Exporter]` | `None` | Custom remote exporter. If `None`, a default `LangTraceExporter` is used. |
| `api_host` | `Optional[str]` | `https://langtrace.ai/` | The API host for the remote exporter. |
| `service_name` | `Optional[str]` | `None` | The service name used when initializing Langtrace. |
| `disable_instrumentations` | `Optional[DisableInstrumentations]` | `None` | Pass an object to disable instrumentation for specific vendors, e.g., `{'only': ['openai']}` or `{'all_except': ['openai']}`. |
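To show how these parameters fit together, here is a hedged sketch of initializing the Python SDK against a self-hosted instance; the host URL, service name, and disabled instrumentations are placeholders to adapt to your deployment:
```python
# Sketch: configuring the Python SDK for a self-hosted Langtrace instance.
# All values below are placeholders; adjust them for your deployment.
from langtrace_python_sdk import langtrace

langtrace.init(
    api_key="<your-langtrace-api-key>",
    api_host="https://langtrace.example.internal/",       # your self-hosted endpoint
    batch=True,                                           # batch spans before export
    write_spans_to_console=False,                         # set True for local debugging
    service_name="my-llm-service",                        # service name attached to traces
    disable_instrumentations={"all_except": ["openai"]},  # trace only OpenAI calls
)
```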