Tracing is a fundamental feature in LLM application development, enabling developers to gain deep insights into their apps’ performance and behavior. By capturing detailed information such as input and output tokens, response times, model parameters, and user feedback, tracing allows you to optimize your LLM applications effectively. This guide will walk you through the process of sending traces from your application to Langtrace Cloud, helping you leverage this powerful tool for analysis and improvement.
To begin sending traces, you’ll need to set up your Langtrace Cloud account: sign up, create a project, and generate an API key for that project. You can also create additional projects and generate API keys for each of them later.
To start sending traces, you need to install the Langtrace SDK and initialize it with your API key. Follow these steps:
Replace <LANGTRACE_API_KEY> with the API key you generated in Step 1.
Make sure to initialize Langtrace before importing any LLM modules to ensure all interactions are properly traced.
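For example, with the Python SDK (a minimal sketch assuming the langtrace-python-sdk package, installed with pip install langtrace-python-sdk; the TypeScript SDK follows the same pattern via Langtrace.init()):

```python
# Langtrace must be initialized before any LLM modules are imported
from langtrace_python_sdk import langtrace

langtrace.init(api_key="<LANGTRACE_API_KEY>")
```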
Once the SDK is initialized, Langtrace will automatically capture traces from supported LLM integrations.
Here’s an example of how to send traces from an LLM application using the OpenAI API. This example demonstrates how to use the Langtrace SDK to send traces from an OpenAI completion request.
Make sure to install the required packages with pip or npm before running the code.
Run the code snippet as shown below:
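Below is a runnable Python sketch of this flow; the model name and prompt are illustrative, and OPENAI_API_KEY is assumed to be set in your environment:

```python
# Initialize Langtrace first so the OpenAI import is instrumented
from langtrace_python_sdk import langtrace

langtrace.init(api_key="<LANGTRACE_API_KEY>")

# Import OpenAI only after Langtrace is initialized
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# This chat completion request is captured automatically by Langtrace
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello to Langtrace!"}],
)
print(response.choices[0].message.content)
```

Once this runs, the completion request should appear as a trace in your Langtrace Cloud dashboard.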
After setting up Langtrace and running your application, your traces will be automatically sent to Langtrace Cloud, where you can view and analyze them on the dashboard.
Now that you’ve set up Langtrace and are sending traces, you can:
View Your Traces: Log in to your Langtrace Cloud account to view and analyze the traces you’ve sent.
Customize Your Tracing: Explore adding custom attributes or spans to capture more specific data about your LLM operations (see the sketch after this list).
Integrate with Your Workflow: Consider how you can use the insights from your traces to optimize your LLM applications.
Explore Advanced Features: Check out our other guides to learn about Langtrace’s additional features.
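As a starting point for custom spans, here is a hedged Python sketch using the with_langtrace_root_span decorator exported by the Python SDK; the function generate_reply and its body are illustrative placeholders:

```python
from langtrace_python_sdk import langtrace, with_langtrace_root_span

langtrace.init(api_key="<LANGTRACE_API_KEY>")

@with_langtrace_root_span("generate_reply")
def generate_reply(prompt: str) -> str:
    # Traced LLM calls made inside this function are grouped
    # under the "generate_reply" root span in the dashboard
    return prompt.upper()  # placeholder for a real LLM call

generate_reply("hello")
```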
Remember, the more you use Langtrace, the more insights you’ll gain into your LLM applications’ performance and behavior.
If you encounter problems while setting up or using Langtrace, here are solutions to some common issues:
API Key Not Recognized
Ensure you’ve copied the full API key correctly. Verify that it’s either set as the LANGTRACE_API_KEY environment variable or passed directly to Langtrace.init() as shown in Step 2. If the issue persists, try regenerating your API key in the Langtrace dashboard.
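For reference, a minimal Python sketch of both options (that init() with no arguments falls back to the LANGTRACE_API_KEY environment variable is assumed from the SDK’s documented behavior):

```python
from langtrace_python_sdk import langtrace

# Option 1: set LANGTRACE_API_KEY in the environment and call init() with no key
langtrace.init()

# Option 2: pass the key explicitly (use one option or the other, not both)
# langtrace.init(api_key="<LANGTRACE_API_KEY>")
```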
Traces Not Appearing in Dashboard
If your traces aren’t showing up in the Langtrace dashboard, you can enable console logging to verify whether traces are being generated and sent:
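A sketch assuming the Python SDK’s write_spans_to_console init option (check your SDK version if the flag name differs):

```python
from langtrace_python_sdk import langtrace

# write_spans_to_console prints each span to stdout so you can
# confirm that spans are being generated before they are exported
langtrace.init(api_key="<LANGTRACE_API_KEY>", write_spans_to_console=True)
```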
SDK Import Errors
If you’re seeing import errors, confirm that the SDK is installed in the environment your application actually runs in, and reinstall or upgrade the package if necessary.
Automatic Instrumentation Not Working
If automatic instrumentation for supported LLM libraries isn’t working, make sure Langtrace is initialized before any LLM modules are imported, and check that you’re passing the libraries to instrument via the instrumentations option (TypeScript SDK only).
If you’re still experiencing issues after trying these solutions, please contact our support team.