Using Langtrace to monitor your Groq apps is quick and easy. Follow these steps:

Setup

  1. Install the Langtrace and Groq SDKs.

Note: You’ll need API keys from Langtrace and Groq. Sign up for Langtrace and/or Groq if you haven’t done so already.

Python
# Install the SDK
pip install -U langtrace-python-sdk groq
  2. Set up environment variables:
Shell
export LANGTRACE_API_KEY=YOUR_LANGTRACE_API_KEY
export GROQ_API_KEY=YOUR_GROQ_API_KEY
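Before initializing either SDK, it can help to fail fast if a key is missing. This is a minimal stdlib-only sketch (not part of the Langtrace or Groq SDKs) that checks for the variables exported above:

```python
import os

# Names of the variables set in the previous step
REQUIRED_VARS = ["LANGTRACE_API_KEY", "GROQ_API_KEY"]

def missing_keys(env=None):
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example with an explicit mapping instead of the real environment
# (the key value here is a placeholder, not a real credential):
print(missing_keys({"GROQ_API_KEY": "gsk_placeholder"}))  # ['LANGTRACE_API_KEY']
```

If `missing_keys()` returns a non-empty list at startup, raise an error before calling `langtrace.init` or constructing the Groq client.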

Usage

Generate a simple output with your chosen model:

import os
from langtrace_python_sdk import langtrace  # Must precede any LLM module imports
langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])
from groq import Groq
client = Groq(
  # This is the default and can be omitted
  api_key=os.environ["GROQ_API_KEY"],
)

# Generate a simple output with the Llama 3 model
chat_completion = client.chat.completions.create(
  messages=[
      {
          "role": "user",
          "content": "What is LangChain?",
      }
  ],
  model="llama3-8b-8192",
)
print(chat_completion.choices[0].message.content)
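The final `print` indexes into the completion object returned by the Groq SDK. Purely as an illustration of that shape (this mock is not the real SDK object, and the content string is a placeholder), the access path `choices[0].message.content` looks like this:

```python
from types import SimpleNamespace

# Stand-in with the same nesting as the chat completion response:
# choices (list) -> message -> content
chat_completion = SimpleNamespace(
    choices=[
        SimpleNamespace(
            message=SimpleNamespace(role="assistant", content="LangChain is ...")
        )
    ]
)

# Same access path as in the snippet above
print(chat_completion.choices[0].message.content)
```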

You can now view your traces on the Langtrace dashboard.

Want to see more supported methods? Check out the sample code in the Langtrace Groq Python Example repository.
