Introduction
Langtrace provides native support for DeepSeek’s family of models. When using DeepSeek models, Langtrace automatically captures essential metrics, including:
- Token usage
- Cost
- Latency
- Model hyperparameters
Prerequisites
Before integrating DeepSeek with Langtrace, ensure you have:
- A Langtrace account with an API key
- DeepSeek API credentials
- A Python environment (3.6 or later)
Installation
- Install the Langtrace Python SDK:
- Set up your environment variables:
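Taken together, the two setup steps above might look like the following. The package name `langtrace-python-sdk` matches the SDK's PyPI listing; the environment-variable names `LANGTRACE_API_KEY` and `DEEPSEEK_API_KEY` are conventional placeholders, so check them against your own configuration:

```shell
# Step 1: install the Langtrace Python SDK from PyPI
pip install langtrace-python-sdk

# Step 2: export credentials (placeholder values — substitute your real keys;
# the variable names here are assumptions, not mandated by the SDK)
export LANGTRACE_API_KEY="<your-langtrace-api-key>"
export DEEPSEEK_API_KEY="<your-deepseek-api-key>"
```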
Integration
Initialize Langtrace before importing any LLM modules so that their calls are instrumented.
Viewing Traces
After integration, you can view your DeepSeek model traces in the Langtrace dashboard. The traces will include:
- Request/response payloads
- Token usage metrics
- Cost information
- Latency measurements
- Model configuration details