Langtrace integrates directly with Langchain, offering detailed, real-time insights into performance metrics such as cost, token usage, accuracy, and latency.

Setup

  1. Install the Langtrace SDK and initialize it in your code.

Note: You’ll need an API key from Langtrace. Sign up for Langtrace if you haven’t done so already.

Python
# Install the SDK
pip install -U langtrace-python-sdk langchain langchain-chroma langchainhub

Typescript
npm install @langtrase/typescript-sdk langchain
  2. Set up environment variables:
Shell
export LANGTRACE_API_KEY=YOUR_LANGTRACE_API_KEY
export OPENAI_API_KEY=YOUR_OPENAI_API_KEY
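
With the keys in place, initialization is a single call. A minimal sketch, assuming the `api_key` argument (the SDK can also read it from the `LANGTRACE_API_KEY` environment variable):

```python
# Initialize Langtrace before importing any LLM framework modules,
# so the SDK can instrument them.
from langtrace_python_sdk import langtrace

langtrace.init(api_key="YOUR_LANGTRACE_API_KEY")
```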

Usage

Generate a simple output with your model:

Create a vector store index and query it.

Retrieve and generate using the relevant snippets of the document.

Format the documents, set up a RAG chain, and query it.

You can now view your traces on the Langtrace dashboard.

Want to see more supported methods? Check out the sample code in the Langtrace Langchain Python Example.