NextJS
Langtrace can be integrated into a NextJS application in just two steps.
Langtrace can be initialized only in the server-side code of NextJS
applications. This includes app/api/ routes for the app router and
pages/api/ routes for the pages router. You can also initialize Langtrace
in server-side rendered pages.
Get Started
First Step
Install the Langtrace SDK in your NextJS application.
Update the next.config.(m)js file.
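As a minimal sketch, the install command and config update might look like the following. The package name `@langtrase/typescript-sdk` and the `serverComponentsExternalPackages` option are assumptions; check the SDK's install instructions for the exact config your NextJS version requires.

```javascript
// Install the SDK first: npm install @langtrase/typescript-sdk

// next.config.mjs — a minimal sketch; the exact option may differ by version.
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    // Assumed setting: keep the SDK out of the server-component bundle
    // so its instrumentation can patch vendor modules at runtime.
    serverComponentsExternalPackages: ['@langtrase/typescript-sdk'],
  },
};

export default nextConfig;
```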
Second Step
Initialize Langtrace in your API routes like so. Replace <API_KEY>
with your Langtrace API key.
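A minimal sketch of an app-router API route with Langtrace initialized. The package names, route path, and model are illustrative assumptions; adapt them to your application.

```typescript
// app/api/chat/route.ts — hypothetical route path for illustration
import * as Langtrace from '@langtrase/typescript-sdk'
import * as openai from 'openai'

// Initialize before the instrumented client is used. In NextJS, pass the
// imported vendor modules explicitly via the `instrumentations` object.
Langtrace.init({
  api_key: '<API_KEY>', // replace with your Langtrace API key
  instrumentations: { openai: openai },
})

const client = new openai.OpenAI()

export async function POST(request: Request) {
  const { prompt } = await request.json()
  // Calls made through the instrumented module are traced by Langtrace.
  const completion = await client.chat.completions.create({
    model: 'gpt-4o-mini', // assumed model name
    messages: [{ role: 'user', content: prompt }],
  })
  return Response.json({ text: completion.choices[0].message.content })
}
```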
Instrumentations
Langtrace supports instrumentations for different LLMs, VectorDBs and Frameworks. The instrumentations object is a key-value pair where the key is the name of the LLM, VectorDB or Framework and the value is the imported module.
You can instrument multiple vendors in your application. For example, you can instrument OpenAI and LlamaIndex in the same application. See LlamaIndex instrumentation for an example.
See initialization examples for all supported instrumentations below.
Replace <API_KEY>
with your Langtrace API key.
Vercel AI SDK
LlamaIndex
Anthropic
OpenAI
Pinecone
ChromaDB
Cohere
groq
pg
qdrant
weaviate
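For instance, instrumenting OpenAI and Anthropic together might look like this sketch. The vendor keys and package names are assumptions based on the key-value convention described above; consult the per-vendor initialization examples for the exact keys.

```typescript
import * as Langtrace from '@langtrase/typescript-sdk'
import * as openai from 'openai'
import * as anthropic from '@anthropic-ai/sdk'

// Each key names a vendor; each value is the imported module to instrument.
// Multiple vendors can be traced in the same application.
Langtrace.init({
  api_key: '<API_KEY>', // replace with your Langtrace API key
  instrumentations: {
    openai: openai,       // assumed key for the OpenAI SDK
    anthropic: anthropic, // assumed key for the Anthropic SDK
  },
})
```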
Recipes
To get started with integrating Langtrace in your NextJS application, you can refer to these NextJS examples in our recipes repository: NextJS Integration Examples