This guide walks you through integrating Ollama, an open-source project for running large language models locally, with Langtrace for monitoring and tracing.

Setup

  1. Install the Langtrace SDK and initialize it in your code, as shown in the sketch below.
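
A minimal sketch of this step, assuming the Langtrace Python SDK (`langtrace-python-sdk` on PyPI) and an API key generated from your Langtrace dashboard; the `LANGTRACE_API_KEY` environment-variable name is an assumption, not a requirement:

```python
# Install first (assumed package name): pip install langtrace-python-sdk
import os

from langtrace_python_sdk import langtrace

# Initialize the SDK once, early in your application, before any calls you
# want traced. Reading the key from LANGTRACE_API_KEY is an assumption;
# supply your key however you normally manage secrets.
langtrace.init(api_key=os.environ.get("LANGTRACE_API_KEY"))
```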

Usage

Here’s a quick example of how to use Langtrace with Ollama:
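
This is a minimal sketch, assuming the official `ollama` Python library and a locally running Ollama server with a model already pulled (the `llama3` model name is an assumption; use whichever model you have). Once `langtrace.init()` has run, the Ollama call below is traced and appears in your Langtrace dashboard:

```python
# Assumes: pip install langtrace-python-sdk ollama, a running Ollama server,
# and a pulled model (e.g. `ollama pull llama3` -- the model name is an
# assumption).
import os

import ollama
from langtrace_python_sdk import langtrace

# Initialize Langtrace before making any Ollama calls so they are traced.
langtrace.init(api_key=os.environ.get("LANGTRACE_API_KEY"))

# A simple chat completion; the request and response are captured as a trace.
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])
```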