Langtrace can only be initialized in the server-side code of NextJS applications. This includes app/api/ route handlers for the app router and pages/api/ API routes for the pages router. You can also initialize Langtrace in server-side rendered pages.

Get Started

1. First Step

Install the Langtrace SDK in your NextJS application.

npm install @langtrase/typescript-sdk

Update the next.config.(m)js file.

next.config.(m)js
import { ModuleAlias } from '@langtrase/typescript-sdk/dist/webpack/plugins/ModuleAlias.js'

const nextConfig = {
  webpack: (config, { isServer }) => {
    if (isServer) {
      // Register the SDK's ModuleAlias resolver plugin for server-side builds.
      config.resolve.plugins = [
        ...(config.resolve.plugins || []),
        new ModuleAlias(process.cwd())
      ];
      // Handle native .node addons pulled in by the instrumentation.
      config.module.rules.push({
        loader: "node-loader",
        test: /\.node$/,
      });
      // Suppress OpenTelemetry bundling warnings.
      config.ignoreWarnings = [{ module: /opentelemetry/ }];
    }
    return config;
  },
};

export default nextConfig;
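The node-loader rule above relies on the node-loader webpack package. If it is not already available in your project, install it as a dev dependency:

npm install node-loader --save-dev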
2. Second Step

Initialize Langtrace in your API routes as shown below. Replace <API_KEY> with your Langtrace API key.
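Below is a minimal sketch of an app router route handler that initializes Langtrace before using an instrumented OpenAI client. The route path, model name, and request shape are illustrative assumptions, not part of the SDK; adjust them to your application.

app/api/chat/route.ts
import * as Langtrace from '@langtrase/typescript-sdk'
import * as openai from 'openai'

// Initialize Langtrace once, before the instrumented module is used.
Langtrace.init({
  api_key: '<API_KEY>', // replace with your Langtrace API key
  instrumentations: { openai: openai }
})

const client = new openai.OpenAI()

export async function POST(req: Request) {
  const { prompt } = await req.json()
  const completion = await client.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: prompt }]
  })
  return Response.json(completion.choices[0].message)
}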

Instrumentations

Langtrace supports instrumentations for different LLMs, VectorDBs, and Frameworks. The instrumentations object is a map of key-value pairs, where each key is the name of the LLM, VectorDB, or Framework and each value is the corresponding imported module.

You can instrument multiple vendors in your application. For example, you can instrument OpenAI and LlamaIndex in the same application. See LlamaIndex instrumentation for an example.
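For instance, both vendors can be passed to a single init call. The sketch below assumes the openai and llamaindex packages are installed; the keys follow the name-to-module pattern described above.

import * as Langtrace from '@langtrase/typescript-sdk'
import * as openai from 'openai'
import * as llamaindex from 'llamaindex'

// Each key names the vendor; each value is the imported module to instrument.
Langtrace.init({
  api_key: '<API_KEY>',
  instrumentations: {
    openai: openai,
    llamaindex: llamaindex
  }
})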

See initialization examples for all supported instrumentations below.

Replace <API_KEY> with your Langtrace API key.