Ollama
Guide to integrating Ollama with Langtrace for LLM monitoring
This guide will walk you through the process of integrating Ollama, an open-source project for running large language models locally, with Langtrace for monitoring and tracing.
Setup
Install Langtrace’s SDK and initialize it in your code before importing your LLM client.
Usage
Here’s a quick example of how to use Langtrace with Ollama: