Using Langtrace to monitor your xAI apps is quick and easy. Follow these steps:

Setup

  1. Install the Langtrace SDK and the OpenAI SDK (xAI's API is OpenAI-compatible, so the OpenAI client is used to call it).

Note: You’ll need API keys from Langtrace and xAI. Sign up for Langtrace and/or xAI if you haven’t done so already.

Python
# Install the Langtrace and OpenAI SDKs
pip install -U langtrace-python-sdk openai
  2. Set up environment variables:
Shell
export LANGTRACE_API_KEY=YOUR_LANGTRACE_API_KEY
export XAI_API_KEY=YOUR_XAI_API_KEY
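
If you're working in a notebook or another environment where exporting shell variables isn't convenient, you can also set the same variables from Python before initializing Langtrace (a minimal sketch; the placeholder values are stand-ins for your real keys):

Python
import os

# Placeholders only; substitute your real Langtrace and xAI keys
os.environ["LANGTRACE_API_KEY"] = "YOUR_LANGTRACE_API_KEY"
os.environ["XAI_API_KEY"] = "YOUR_XAI_API_KEY"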

Usage

Generate a simple output with the grok-beta model:

Python
import os
from langtrace_python_sdk import langtrace  # Must precede any LLM module imports
langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])
from openai import OpenAI
client = OpenAI(
  # Point the OpenAI client at xAI's API with your xAI key
  api_key=os.environ["XAI_API_KEY"],
  base_url="https://api.x.ai/v1",
)

# Generate a simple output with the grok-beta model

chat_completion = client.chat.completions.create(
  messages=[
      {
          "role": "user",
          "content": "What is LangChain?",
      }
  ],
  model="grok-beta",
)
print(chat_completion.choices[0].message.content)
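
Langtrace instruments the OpenAI client itself, so a streamed call made with the same client should also show up as a trace (treat streaming trace coverage as an assumption and confirm it in the Langtrace docs). A minimal streaming sketch:

Python
# Assumption: Langtrace's OpenAI instrumentation also records streamed calls
stream = client.chat.completions.create(
  messages=[
      {
          "role": "user",
          "content": "What is LangChain?",
      }
  ],
  model="grok-beta",
  stream=True,
)
for chunk in stream:
  # Each chunk carries an incremental piece of the reply in choices[0].delta
  if chunk.choices and chunk.choices[0].delta.content:
    print(chunk.choices[0].delta.content, end="", flush=True)
print()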

You can now view your traces on the Langtrace dashboard: