Welcome to the Langtrace AI documentation
Langtrace is an open-source observability tool that collects and analyzes traces to help you improve your LLM apps. Langtrace consists of the following components:
- Python SDK
- Typescript SDK
- trace-attributes
- Langtrace
OpenTelemetry: The traces generated by Langtrace follow the OpenTelemetry standard. This means you can use Langtrace with your existing observability stack (no Langtrace API key required) or export the traces to Langtrace.
Langtrace optimizes for the following three pillars of observability for your LLM apps:
Need help with the SDK? Contact us
The first step to using Langtrace is to import the SDK into your Typescript or Python project.
1. Sign up for Langtrace here.
2. Go to the Langtrace dashboard.
3. Create a new project, providing a name and description.
4. Click on the project name.
5. Click the Generate API Key button.
6. Copy the generated API key. You will need this key to initialize the SDK in your project.
You can now view your traces on the Langtrace dashboard.
Dive into the SDK features. Here are the supported languages:
We are working hard to add support for more languages.
Langtrace offers flexible pricing options to suit various needs and usage levels. Here’s an overview of our pricing structure:
- Starter: Ideal for individuals or hackers just beginning their LLM observability journey.
- Growth: Designed for growing teams and projects with increasing observability needs.
- Scale: Tailored solutions for large-scale enterprises with complex requirements.
- Self-Hosted: Perfect for developers who prioritize data control and customization.
A span represents a single unit of work or operation in your application. For LLM apps, this typically corresponds to an API call or a specific step in your language model pipeline.
If you’re just getting started or want to self-host, our Self-Hosted option is perfect. For small teams or projects using our managed service, the Starter plan is ideal. As your usage grows, the Growth plan offers more flexibility and features. Large enterprises with specific needs should consider our Scale plan.
Yes, you can change your plan at any time to match your current needs.
For the Growth plan, you’ll be charged for additional spans above the 50K limit. For other plans, please contact our sales team at [email protected] to discuss options.
Please contact our sales team to discuss trial options for our paid features.
To discuss specific requirements or to learn more about our Enterprise offerings, please contact us at [email protected].