Using Langtrace to monitor your Gemini-backed LLM apps is quick and easy. Follow these steps:

Setup

  1. Install the Langtrace SDK and initialize it in your code.

Note: You’ll need API keys from Langtrace and Gemini. Sign up for Langtrace and/or Gemini if you haven’t done so already.

Shell
# Install the SDK
pip install -U langtrace-python-sdk
Shell
pip install -q -U google-generativeai
  2. Set up environment variables:
Shell
export LANGTRACE_API_KEY=YOUR_LANGTRACE_API_KEY
export GEMINI_API_KEY=YOUR_GEMINI_API_KEY
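Since both SDKs read their keys from the environment at startup, it can help to fail fast with a clear message before initializing anything. Below is a minimal sketch; `require_env` is a hypothetical helper for illustration, not part of either SDK:

```python
import os


def require_env(*names):
    """Return the values of the given environment variables,
    raising a clear error if any are missing.
    (Hypothetical helper, not part of the Langtrace or Gemini SDKs.)"""
    missing = [n for n in names if not os.environ.get(n)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return [os.environ[n] for n in names]
```

You could call `require_env("LANGTRACE_API_KEY", "GEMINI_API_KEY")` at the top of your script, before `langtrace.init`, so a missing key surfaces immediately rather than as an opaque API error later.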

Usage

Generate a simple output with the Gemini model:

import os

from langtrace_python_sdk import langtrace, with_langtrace_root_span

langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])

# Import the LLM client after initializing Langtrace so its calls are instrumented
import google.generativeai as genai


@with_langtrace_root_span("chat_complete")
def chat_complete():
    genai.configure(api_key=os.environ["GEMINI_API_KEY"])
    model = genai.GenerativeModel("gemini-1.5-flash")

    response = model.generate_content("Write a story about a magic backpack.")
    print(response.text)


chat_complete()
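The same root-span pattern extends to multi-turn conversations. The sketch below assumes the same environment variables as above and uses `model.start_chat()` / `chat.send_message()` from the google-generativeai SDK; the span name `multi_turn_chat` is just an illustrative label:

```python
import os

from langtrace_python_sdk import langtrace, with_langtrace_root_span

langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])

# Import the LLM client after initializing Langtrace so its calls are instrumented
import google.generativeai as genai


@with_langtrace_root_span("multi_turn_chat")
def multi_turn_chat():
    genai.configure(api_key=os.environ["GEMINI_API_KEY"])
    model = genai.GenerativeModel("gemini-1.5-flash")

    # start_chat keeps the conversation history, so both turns are
    # grouped under the same root span in Langtrace.
    chat = model.start_chat()
    first = chat.send_message("Name one use for a magic backpack.")
    second = chat.send_message("Now write a two-sentence story about it.")
    print(first.text)
    print(second.text)


multi_turn_chat()
```

In the Langtrace dashboard, both `send_message` calls should appear as child spans of `multi_turn_chat`, which makes it easy to compare latency and token usage per turn.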