Setup

  1. Install the Langtrace SDK and initialize it in your code.

Note: You’ll need API keys from Langtrace and Pinecone. Sign up for Langtrace and/or Pinecone if you haven’t done so already.

Python
# Install the Langtrace and Pinecone SDKs
pip install -U langtrace-python-sdk pinecone
TypeScript
npm install @langtrase/typescript-sdk @pinecone-database/pinecone
  2. Set up environment variables:
Shell
export LANGTRACE_API_KEY=YOUR_LANGTRACE_API_KEY
export PINECONE_API_KEY=YOUR_PINECONE_API_KEY

Usage

Initialize Langtrace, create a Pinecone index, and upsert some vectors. Langtrace automatically traces the Pinecone operations:

import os
from langtrace_python_sdk import langtrace  # Must precede any LLM module imports
langtrace.init(api_key=os.environ['LANGTRACE_API_KEY'])
from pinecone import Pinecone, ServerlessSpec

pc = Pinecone(api_key=os.environ['PINECONE_API_KEY'])

# Create an Index and upsert some data in Pinecone:
pc.create_index(
    name="index",
    dimension=8, # Replace with your model dimensions
    metric="euclidean", # Replace with your model metric
    spec=ServerlessSpec(
        cloud="aws",
        region="us-east-1"
    )
)
index = pc.Index("index")
index.upsert(
    vectors=[
        {"id": "A", "values": [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]},
        {"id": "B", "values": [0.2, 0.2, 0.2, 0.2, 0.2, 0.2, 0.2, 0.2]},
        {"id": "C", "values": [0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3]},
        {"id": "D", "values": [0.4, 0.4, 0.4, 0.4, 0.4, 0.4, 0.4, 0.4]}
    ]
)
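
Other Pinecone operations, such as queries against the index, are traced the same way. The following is a minimal sketch that runs a similarity search against the index created above; the query vector and top_k value are placeholder assumptions, not values from this guide:

# Query the index created above; the query vector and top_k are placeholders
results = index.query(
    vector=[0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1],
    top_k=3,
    include_values=True
)
print(results)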

You can now view your traces on the Langtrace dashboard.

Want to see more supported methods? Check out the sample code in the Langtrace Pinecone Python Example repository.