Using Langtrace to monitor your Cohere-backed LLM apps is quick and easy. Follow these steps:

Setup

  1. Install Langtrace’s SDK and initialize it in your code.

Note: You’ll need API keys from Langtrace and Cohere. Sign up for Langtrace and/or Cohere if you haven’t done so already.

Shell
# Install the SDK
pip install -U langtrace-python-sdk cohere
  2. Set up environment variables:
Shell
export LANGTRACE_API_KEY=YOUR_LANGTRACE_API_KEY
export COHERE_API_KEY=YOUR_COHERE_API_KEY

Usage

Generate a simple output with Cohere’s Command model:

# Imports
import os
from langtrace_python_sdk import langtrace  # Must precede any LLM module imports
langtrace.init(api_key=os.environ['LANGTRACE_API_KEY'])
import cohere
co = cohere.Client(
    api_key=os.environ['COHERE_API_KEY'],
)

# Let's query Cohere's Command model
chat = co.chat(
    message="What is Langchain?",
    model="command"
)
print(chat.text)
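Langtrace also traces streaming calls. Here is a hedged sketch using the Cohere SDK's `chat_stream` method (the `event_type` and `text` fields follow the v5 Python SDK and may differ across versions); the `collect_stream_text` helper is a hypothetical addition that keeps the stream-handling logic in one place:

```python
def collect_stream_text(events):
    """Concatenate the text chunks from text-generation events in a chat stream."""
    parts = []
    for event in events:
        # Only text-generation events carry response text
        if getattr(event, "event_type", None) == "text-generation":
            parts.append(event.text)
    return "".join(parts)

# Usage with a live client (requires a valid COHERE_API_KEY):
# full_text = collect_stream_text(
#     co.chat_stream(message="What is Langtrace?", model="command")
# )
# print(full_text)
```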

You can now view your traces on the Langtrace dashboard.

Want to see more supported methods? Check out the sample code in the Langtrace Cohere Python Example repository.