Introduction

Langtrace provides native support for DeepSeek’s family of models. When using DeepSeek models, Langtrace automatically captures essential metrics including:

  • Token usage
  • Cost
  • Latency
  • Model hyperparameters

Prerequisites

Before integrating DeepSeek with Langtrace, ensure you have:

  • A Langtrace account with an API key
  • DeepSeek API credentials
  • A Python environment (version 3.6 or later)

Installation

  1. Install the Langtrace Python SDK:
pip install -U langtrace-python-sdk
  2. Set up your environment variables:
export LANGTRACE_API_KEY=YOUR_LANGTRACE_API_KEY
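
If you prefer to keep your DeepSeek credentials out of code as well, you can export them too; the DEEPSEEK_API_KEY name is only a convention used in the sketch later in this guide, not something the SDK requires:

export DEEPSEEK_API_KEY=YOUR_DEEPSEEK_API_KEY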

Integration

Initialize Langtrace before importing any LLM modules:
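
The sketch below illustrates that ordering. It assumes DeepSeek is reached through the OpenAI-compatible Python client with the standard DeepSeek base URL and the deepseek-chat model (confirm both against your DeepSeek account), and that LANGTRACE_API_KEY and DEEPSEEK_API_KEY are set in the environment as above:

import os

# Initialize Langtrace first so it can instrument the LLM client.
# langtrace.init() picks up LANGTRACE_API_KEY from the environment;
# you can also pass api_key=... explicitly.
from langtrace_python_sdk import langtrace

langtrace.init()

# Import the LLM client only after Langtrace has been initialized.
from openai import OpenAI

# DeepSeek exposes an OpenAI-compatible API.
client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url="https://api.deepseek.com",
)

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Hello from Langtrace!"}],
)
print(response.choices[0].message.content)

Each call made through the instrumented client is then recorded as a trace.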

Viewing Traces

After integration, you can view your DeepSeek model traces in the Langtrace dashboard. The traces will include:

  • Request/response payloads
  • Token usage metrics
  • Cost information
  • Latency measurements
  • Model configuration details

Additional Resources