Langtrace and LlamaIndex Integration Guide

Langtrace integrates directly with LlamaIndex, offering detailed, real-time insights into performance metrics such as accuracy, evaluations, and latency.
Setup

Install Langtrace's SDK and initialize it in your code.
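A minimal setup sketch: install the `langtrace-python-sdk` package, then initialize it with an API key generated from the Langtrace dashboard. The `LANGTRACE_API_KEY` environment variable name is a placeholder convention, not mandated by the SDK.

```python
# Install the SDK first:
#   pip install langtrace-python-sdk

import os

# Initialize Langtrace *before* importing LlamaIndex so the
# instrumentation can patch the framework at import time.
from langtrace_python_sdk import langtrace

langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])
```

Once initialized, traces from supported frameworks are exported to your Langtrace project automatically; no further per-call instrumentation is required.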
Usage

Here's a quick example of how to use Langtrace with LlamaIndex:

Python Notebook
Google Colab
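The end-to-end flow can be sketched as below. This assumes a local `data/` directory of documents and an OpenAI key in `OPENAI_API_KEY` (LlamaIndex's default LLM backend); both are illustrative choices, not requirements of Langtrace.

```python
import os

# Initialize Langtrace before importing LlamaIndex so traces are captured.
from langtrace_python_sdk import langtrace

langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Build an index over local documents and run a query.
# Langtrace records spans for the embedding, indexing, and
# query/LLM calls automatically.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
```

After running this, the indexing and query operations appear as traces in the Langtrace app, where you can inspect latency and token usage per span.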