Langtrace can be easily integrated into NextJS applications in just two steps.
Langtrace can be initialized only in the server-side code of NextJS
applications. This includes app/api/ routes for the app router and pages/api/
routes for the pages router. You can also initialize Langtrace in server-side
rendered pages.
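For example, here is a minimal sketch of where the initialization call might live in a pages router API route. The file path and the instrumented module are illustrative choices, not requirements:

// pages/api/chat.ts (illustrative path, pages router)
import type { NextApiRequest, NextApiResponse } from 'next';
import * as Langtrace from '@langtrase/typescript-sdk';
import * as openai from 'openai';

// Runs once on the server when this module is first loaded.
Langtrace.init({
  api_key: '<API_KEY>',
  instrumentations: { openai: openai },
});

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  // Calls made through an instrumented OpenAI client in this handler are traced.
  res.status(200).json({ ok: true });
}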
Langtrace supports instrumentation for different LLMs, VectorDBs, and frameworks. The instrumentations object is a key-value map where the key is the name of the LLM, VectorDB, or framework and the value is the imported module.
You can instrument multiple vendors in the same application, for example OpenAI and LlamaIndex together. See the LlamaIndex instrumentation below for an example.
See the initialization examples for all supported instrumentations below. Replace <API_KEY> with your Langtrace API key.
Vercel AI SDK
import * as Langtrace from '@langtrase/typescript-sdk';
import * as ai from 'ai';

Langtrace.init({
  api_key: '<API_KEY>',
  instrumentations: {
    ai: ai,
  },
});
LlamaIndex
import * as Langtrace from '@langtrase/typescript-sdk';
import * as llamaindex from 'llamaindex';
import * as openai from 'openai';

Langtrace.init({
  api_key: '<API_KEY>',
  instrumentations: {
    llamaindex: llamaindex,
    openai: openai,
  },
});
Anthropic
import * as Langtrace from "@langtrase/typescript-sdk";import * as anthropic from "@anthropic-ai/sdk";/*** Ensure you have ANTHROPIC_API_KEY in your environment variables and LANGTRACE_API_KEY in your environment variables*/Langtrace.init({api_key: '<API_KEY>',instrumentations: { anthropic: anthropic,},});
OpenAI
import * as Langtrace from "@langtrase/typescript-sdk";import * as openai from "openai";/** * Ensure you have OPENAI_API_KEY in your environment variables and LANGTRACE_API_KEY in your environment variables */Langtrace.init({ api_key: '<API_KEY>', instrumentations: { openai: openai, },});
Pinecone
import * as Langtrace from "@langtrase/typescript-sdk";import * as pinecone from "@pinecone-database/pinecone";import * as openai from "openai";/** * Ensure you have PINECONE_API_KEY in your environment variables and LANGTRACE_API_KEY in your environment variables */Langtrace.init({ api_key: '<API_KEY>', instrumentations: { pinecone: pinecone, openai: openai, },});
ChromaDB
import * as Langtrace from "@langtrase/typescript-sdk";import * as chroma from "chromadb";import * as openai from "openai";/** * Ensure you have installed chromadb * Make sure you have pipx installed. * python3 -m venv ./app/api/chroma/chromaenv * source ./app/api/chroma/chromaenv/bin/activate * pipx install chromadb * chroma run --path ./app/api/chroma/db * Ensure you have OPENAI_API_KEY in your environment variables and LANGTRACE_API_KEY in your environment variables */Langtrace.init({ api_key: '<API_KEY>', instrumentations: { openai: openai, chromadb: chroma, },});
Cohere
import * as Langtrace from "@langtrase/typescript-sdk";import * as cohere from 'cohere-ai'Langtrace.init({ api_key: '<API_KEY>', instrumentations: { cohere: cohere },});
Groq
import * as Langtrace from "@langtrase/typescript-sdk";import * as groq from 'groq-sdk'Langtrace.init({ api_key: '<API_KEY>', instrumentations: { groq: groq },});
pg
import * as Langtrace from "@langtrase/typescript-sdk";import * as pg from 'pg'Langtrace.init({ api_key: '<API_KEY>', instrumentations: { pg: pg },});
Qdrant
import * as Langtrace from "@langtrase/typescript-sdk";import * as qdrant from '@qdrant/js-client-rest'/*** To run qdrant locally using docker.* 1. docker pull qdrant/qdrant* 2. docker run -p 6333:6333 -p 6334:6334 \ -v $(pwd)/qdrant_storage:/qdrant/storage:z \ qdrant/qdrant**/Langtrace.init({ api_key: '<API_KEY>', instrumentations: { qdrant: qdrant },});
Weaviate
import * as Langtrace from "@langtrase/typescript-sdk";import * as weaviate from 'weaviate-ts-client'Langtrace.init({ api_key: '<API_KEY>', instrumentations: { weaviate: weaviate },});
To get started with integrating Langtrace in your NextJS application, you can refer to these NextJS examples in our recipes repository:
NextJS Integration Examples