Using Langtrace to monitor your Perplexity LLM apps is quick and easy. Follow these steps:

Setup

  1. Install the Langtrace SDK and the OpenAI SDK (Perplexity's API is OpenAI-compatible, so no separate Perplexity SDK is needed).

Note: You’ll need API keys from both Langtrace and Perplexity. Sign up for each service if you haven’t already.

Python
# Install the SDK
pip install -U langtrace-python-sdk openai
  2. Set up environment variables:
Shell
export LANGTRACE_API_KEY=YOUR_LANGTRACE_API_KEY
export PERPLEXITY_API_KEY=YOUR_PERPLEXITY_API_KEY

Usage

Generate a simple completion with a Perplexity-hosted model:

import os
from langtrace_python_sdk import langtrace  # Must precede any LLM module imports
langtrace.init(api_key=os.environ['LANGTRACE_API_KEY'])

# Perplexity's API is OpenAI-compatible, so we use the OpenAI client
from openai import OpenAI


# Generate a simple output with the mistral-7b-instruct model
messages = [
  {
      "role": "system",
      "content": (
          "You are an artificial intelligence assistant and you need to "
          "engage in a helpful, detailed, polite conversation with a user."
      ),
  },
  {
      "role": "user",
      "content": (
          "Count to 100, with a comma between each number and no newlines. "
          "E.g., 1, 2, 3, ..."
      ),
  },
]

client = OpenAI(api_key=os.environ['PERPLEXITY_API_KEY'], base_url="https://api.perplexity.ai")

response = client.chat.completions.create(
  model="mistral-7b-instruct",
  messages=messages,
)
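The response object follows the OpenAI chat-completions schema, so the assistant's reply lives at choices[0].message.content. A small sketch of pulling it out (the helper name reply_text and the stand-in object are illustrative, not part of either SDK):

```python
from types import SimpleNamespace  # only used to build the stand-in demo object

def reply_text(response):
    """Extract the assistant's reply from an OpenAI-style chat completion."""
    return response.choices[0].message.content

# Demo with a stand-in object shaped like the SDK's response:
fake = SimpleNamespace(
    choices=[SimpleNamespace(message=SimpleNamespace(content="1, 2, 3"))]
)
print(reply_text(fake))  # → 1, 2, 3
```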

You can now view your traces on the Langtrace dashboard.

Want to see more supported methods? Check out the sample code in the Langtrace Perplexity Python Example repository.