# Langtrace AI Docs

## Docs

- [POST Project](https://docs.langtrace.ai/api-reference/project/POST-project.md): This endpoint enables creation of a new project
- [POST Project API Key](https://docs.langtrace.ai/api-reference/project/POST-project-api-key.md): This endpoint enables creation of an API key for an existing project
- [GET Prompt from Registry](https://docs.langtrace.ai/api-reference/prompt-registry/GET-prompt-registry.md): This endpoint enables queries on the prompt registry. You can query for a specific prompt version or get the currently live prompt. Optionally, pass in variables to populate the prompt's variables with data
- [Send Trace](https://docs.langtrace.ai/api-reference/traces/POST-trace.md): Send OpenTelemetry-compatible traces to Langtrace
- [POST Traces](https://docs.langtrace.ai/api-reference/traces/POST-traces.md): This endpoint lets you download traces for a project.
- [Concepts](https://docs.langtrace.ai/concepts.md)
- [Contact Us](https://docs.langtrace.ai/contact/contact-us.md)
- [FAQs](https://docs.langtrace.ai/faqs.md): Welcome to the Langtrace AI FAQs page. Here you will find answers to the most frequently asked questions about Langtrace.
- [Annotations](https://docs.langtrace.ai/features/annotations.md): Langtrace allows you to create tests and manually annotate your traces to evaluate performance and create datasets for running evaluations.
- [Compare Evaluations](https://docs.langtrace.ai/features/compare-evaluations.md): Langtrace allows you to compare evaluations run on different models to understand the performance of your application across models.
- [Evaluations](https://docs.langtrace.ai/features/evaluations.md): Langtrace allows you to run evaluations on annotated datasets and get insights into the performance of your application.
- [Manage Prompts](https://docs.langtrace.ai/features/manage_prompts.md): Langtrace lets you store and version prompts, making it easy to manage and reuse them across your applications.
- [Manage Zod Schema for Tool Calling](https://docs.langtrace.ai/features/manage_zod_schema.md): Langtrace lets you store and version Zod schemas for tool calling, making it easy to manage and reuse them across your applications.
- [Model Playground](https://docs.langtrace.ai/features/playground.md): Langtrace has a model playground where you can iterate on your prompts with different models and model settings to see how they perform.
- [Authentication](https://docs.langtrace.ai/hosting/auth.md): Configure various authentication methods for your self-hosted Langtrace setup using NextAuth.js.
- [Langtrace Setup on Azure](https://docs.langtrace.ai/hosting/hosting_options/azure.md): Host Langtrace on Azure App Service
- [Docker Compose Setup](https://docs.langtrace.ai/hosting/hosting_options/docker-compose.md): Set up Langtrace using Docker Compose
- [Docker Setup](https://docs.langtrace.ai/hosting/hosting_options/docker-standalone.md): Run the Langtrace app using Docker
- [Langtrace on Kubernetes](https://docs.langtrace.ai/hosting/hosting_options/kubernetes.md): Set up Langtrace using the Helm chart
- [Langtrace on Railway](https://docs.langtrace.ai/hosting/hosting_options/railway.md): Run the Langtrace client on Railway
- [Overview](https://docs.langtrace.ai/hosting/overview.md): You can self-host Langtrace to monitor your applications. This guide will help you set up Langtrace on your own infrastructure.
- [Recommended Configurations](https://docs.langtrace.ai/hosting/recommended_configurations.md): Recommended hardware configurations for running Langtrace on your own cloud infrastructure.
- [Traces retention](https://docs.langtrace.ai/hosting/traces_retention.md)
- [Langtrace Self-Hosting Configuration](https://docs.langtrace.ai/hosting/using_local_setup.md): Learn how to use the Langtrace SDK with a self-hosted setup
- [How to Guides](https://docs.langtrace.ai/how-to-guides.md): A collection of guides to help you get started with Langtrace.
- [Introduction](https://docs.langtrace.ai/introduction.md): Welcome to the Langtrace AI documentation
- [Quickstart](https://docs.langtrace.ai/quickstart.md): Start shipping traces to Langtrace Cloud or your preferred OpenTelemetry-compatible backend in under 5 minutes!
- [Langtrace Python SDK](https://docs.langtrace.ai/sdk/python_sdk.md)
- [Langtrace SDK Features](https://docs.langtrace.ai/sdk/sdk_features.md)
- [Langtrace Typescript SDK](https://docs.langtrace.ai/sdk/typescript_sdk.md)
- [Agno](https://docs.langtrace.ai/supported-integrations/llm-frameworks/agno.md): Learn how to use Langtrace with Agno for building multi-modal agents and tracing
- [AWS Bedrock](https://docs.langtrace.ai/supported-integrations/llm-frameworks/bedrock.md): Langtrace and AWS Bedrock Integration Guide
- [Cleanlab](https://docs.langtrace.ai/supported-integrations/llm-frameworks/cleanlab.md): Integrate Cleanlab TLM with Langtrace for LLM observability
- [CrewAI](https://docs.langtrace.ai/supported-integrations/llm-frameworks/crewai.md): Langtrace and CrewAI Integration Guide
- [DSPy](https://docs.langtrace.ai/supported-integrations/llm-frameworks/dspy.md): Langtrace and DSPy Integration Guide
- [Graphlit](https://docs.langtrace.ai/supported-integrations/llm-frameworks/graphlit.md): Langtrace and Graphlit Integration Guide
- [Guardrails](https://docs.langtrace.ai/supported-integrations/llm-frameworks/guardrails.md): Learn how to use Langtrace with Guardrails AI for enhanced LLM validation and tracing
- [Langchain](https://docs.langtrace.ai/supported-integrations/llm-frameworks/langchain.md): Langtrace and Langchain Integration Guide
- [LiteLLM](https://docs.langtrace.ai/supported-integrations/llm-frameworks/litellm.md): Langtrace and LiteLLM Integration Guide
- [LlamaIndex](https://docs.langtrace.ai/supported-integrations/llm-frameworks/llamaindex.md): Langtrace and LlamaIndex Integration Guide
- [Mem0](https://docs.langtrace.ai/supported-integrations/llm-frameworks/mem0.md): Langtrace and Mem0 Integration Guide
- [Neo4j GraphRAG](https://docs.langtrace.ai/supported-integrations/llm-frameworks/neo4j-graphrag.md): Learn how to use Langtrace with Neo4j GraphRAG for Retrieval Augmented Generation with knowledge graphs
- [OpenAI Agents SDK](https://docs.langtrace.ai/supported-integrations/llm-frameworks/openai-agents-sdk.md): Langtrace and OpenAI Agents SDK Integration Guide with OTEL Compatibility
- [SwarmZero](https://docs.langtrace.ai/supported-integrations/llm-frameworks/swarmzero.md): Langtrace and SwarmZero Integration Guide
- [Vercel AI SDK](https://docs.langtrace.ai/supported-integrations/llm-frameworks/vercelaisdk.md): Langtrace and Vercel AI SDK Integration Guide
- [Anthropic](https://docs.langtrace.ai/supported-integrations/llm-tools/anthropic.md): Anthropic is an AI research organization behind the Claude model family. Whether you're brainstorming alone or building with a team of thousands, Claude is here to help.
- [Arch](https://docs.langtrace.ai/supported-integrations/llm-tools/arch.md): Arch is an intelligent gateway designed to protect, observe, and personalize AI agents with your APIs. Engineered with purpose-built LLMs, Arch handles the critical but undifferentiated tasks related to the handling and processing of prompts, including detecting and rejecting jailbreak attempts, int…
- [Azure-OpenAI](https://docs.langtrace.ai/supported-integrations/llm-tools/azure-openai.md): Azure OpenAI Service integrates OpenAI's advanced AI models, like GPT-4, with Azure's secure cloud infrastructure, enabling businesses to enhance their applications with sophisticated language processing capabilities. It provides scalable, customizable, and enterprise-grade AI solutions while ensuri…
- [Cerebras](https://docs.langtrace.ai/supported-integrations/llm-tools/cerebras.md): Learn how to use Langtrace with Cerebras for AI workload observability
- [Cohere](https://docs.langtrace.ai/supported-integrations/llm-tools/cohere.md): Cohere is an AI company specializing in advanced natural language processing (NLP) technologies. They provide powerful language models and easy-to-use APIs for text generation, sentiment analysis, summarization, and more. Cohere's solutions can be customized and fine-tuned for specific tasks, making…
- [DeepSeek](https://docs.langtrace.ai/supported-integrations/llm-tools/deepseek.md): Learn how to integrate Langtrace with DeepSeek's family of models
- [Gemini](https://docs.langtrace.ai/supported-integrations/llm-tools/gemini.md): Gemini is an AI-driven platform that enhances conversational experiences by enabling seamless user-system interactions. Integrated with Langtrace, it provides deep language tracing and insights into dialogue patterns. This combination helps optimize AI-driven conversations for improved efficiency an…
- [Groq](https://docs.langtrace.ai/supported-integrations/llm-tools/groq.md): Groq provides cutting-edge AI acceleration hardware and software solutions designed to maximize the performance and efficiency of machine learning and AI workloads. Their innovative technology enables rapid processing speeds and scalability, making it ideal for complex data-intensive applications. G…
- [KubeAI](https://docs.langtrace.ai/supported-integrations/llm-tools/kubeai.md): Installing Langtrace in KubeAI.
- [Mistral AI](https://docs.langtrace.ai/supported-integrations/llm-tools/mistral-ai.md): Mistral AI is a cutting-edge platform focused on developing advanced large language models that prioritize efficiency and performance. Known for producing compact yet powerful AI models, Mistral AI aims to democratize access to high-performance AI by making these models more accessible and adaptable…
- [Ollama](https://docs.langtrace.ai/supported-integrations/llm-tools/ollama.md): Guide to integrating Ollama with Langtrace for LLM monitoring
- [OpenAI](https://docs.langtrace.ai/supported-integrations/llm-tools/openai.md): OpenAI is a leading artificial intelligence research and deployment company committed to ensuring that artificial general intelligence (AGI) benefits all of humanity. It develops cutting-edge AI models, such as GPT-4, to assist in various applications, from natural language processing to complex prob…
- [Perplexity](https://docs.langtrace.ai/supported-integrations/llm-tools/perplexity.md): Perplexity is an advanced AI-powered search engine that provides instant, accurate answers to complex questions by analyzing vast amounts of information. Utilizing state-of-the-art natural language processing, it delivers precise results and insightful summaries, enhancing productivity and decision-…
- [xAI](https://docs.langtrace.ai/supported-integrations/llm-tools/xai.md): xAI introduces X's Grok family of models.
- [Dash0](https://docs.langtrace.ai/supported-integrations/observability-tools/dash0.md): Langtrace and Dash0 Integration Guide.
- [Datadog](https://docs.langtrace.ai/supported-integrations/observability-tools/datadog.md): Langtrace and Datadog Integration Guide
- [Elastic APM](https://docs.langtrace.ai/supported-integrations/observability-tools/elastic.md): Langtrace and Elastic APM Integration Guide
- [Grafana Cloud](https://docs.langtrace.ai/supported-integrations/observability-tools/grafana.md): Langtrace and Grafana Integration Guide
- [Honeycomb](https://docs.langtrace.ai/supported-integrations/observability-tools/honeycomb.md): Langtrace and Honeycomb Integration Guide
- [IBM Instana](https://docs.langtrace.ai/supported-integrations/observability-tools/ibm-instana.md): Langtrace and IBM Instana Integration Guide
- [New Relic](https://docs.langtrace.ai/supported-integrations/observability-tools/newrelic.md): Langtrace and New Relic Integration Guide
- [SigNoz](https://docs.langtrace.ai/supported-integrations/observability-tools/signoz.md): Langtrace and SigNoz Integration Guide.
- [OTEL Configuration](https://docs.langtrace.ai/supported-integrations/otel-support/otel-configuration.md): Learn how to configure the OpenTelemetry Collector integration.
- [Python SDK](https://docs.langtrace.ai/supported-integrations/otel-support/python-http-json.md): Setting up the Langtrace Python SDK with the OTEL Collector
- [TypeScript SDK](https://docs.langtrace.ai/supported-integrations/otel-support/ts-http-json.md): Setting up the Langtrace Typescript SDK with the OTEL Collector
- [Overview](https://docs.langtrace.ai/supported-integrations/overview.md): Langtrace supports a variety of LLMs, frameworks, and vector databases.
- [ChromaDB](https://docs.langtrace.ai/supported-integrations/vector-stores/chromadb.md): Chroma gives you the tools to store embeddings and their metadata.
- [Milvus Vector Database](https://docs.langtrace.ai/supported-integrations/vector-stores/milvus.md): Milvus is a powerful open-source vector database built for scalable similarity search and AI applications.
- [MongoDB Vector Search](https://docs.langtrace.ai/supported-integrations/vector-stores/mongodb.md): MongoDB is a cross-platform document-oriented database program.
- [Neo4j](https://docs.langtrace.ai/supported-integrations/vector-stores/neo4j.md): Learn how to use Langtrace with Neo4j for tracing graph database operations
- [Pinecone](https://docs.langtrace.ai/supported-integrations/vector-stores/pinecone.md): Pinecone is a vector database that enables fast and accurate vector search for building AI applications. It provides the infrastructure for the long-term memory and retrieval needed to develop state-of-the-art AI systems.
- [Qdrant](https://docs.langtrace.ai/supported-integrations/vector-stores/qdrant.md): Langtrace can be seamlessly integrated with Qdrant, a high-performance vector database. This integration allows you to trace and monitor your vector search operations, providing valuable insights into your AI application's performance and behavior.
- [Weaviate](https://docs.langtrace.ai/supported-integrations/vector-stores/weaviate.md): Weaviate is an open-source vector database.
- [Django](https://docs.langtrace.ai/supported-integrations/web-frameworks/django.md): Langtrace can be easily integrated with Django applications with the help of the Langtrace SDK.
- [FastAPI](https://docs.langtrace.ai/supported-integrations/web-frameworks/fastapi.md): Langtrace can be easily integrated with FastAPI applications with the help of the Langtrace SDK.
- [Flask](https://docs.langtrace.ai/supported-integrations/web-frameworks/flask.md): Langtrace can be easily integrated with Flask applications with the help of the Langtrace SDK.
- [NextJS](https://docs.langtrace.ai/supported-integrations/web-frameworks/nextjs.md): Langtrace can be easily integrated with NextJS applications with just two steps.
- [Add Additional Attributes to Traces](https://docs.langtrace.ai/tracing/additional_attributes.md): You can add additional attributes to traces in Langtrace by simply wrapping your code with the `withAdditionalAttributes` or `inject_additional_attributes` function. This will add the specified attributes to the trace spans.
- [Attach Prompt IDs and Versions to Traces](https://docs.langtrace.ai/tracing/attach_prompt_id.md): Learn how to attach prompt IDs and prompt versions to traces in Langtrace. You can use the `withAdditionalAttributes` or `inject_additional_attributes` function to pass prompt IDs and versions. These can in turn be used within Langtrace for filtering and grouping traces.
- [Attach User IDs to Traces](https://docs.langtrace.ai/tracing/attach_user_id.md): Learn how to attach user IDs to traces in Langtrace. You can use the `withAdditionalAttributes` or `inject_additional_attributes` function to pass user IDs. These can in turn be used within Langtrace for filtering and grouping traces.
- [Set custom span names to traces](https://docs.langtrace.ai/tracing/custom_span_naming.md): Learn how to set custom span names to traces in Langtrace. You can use the `inject_additional_attributes` function to set custom span names to traces. This name will be appended to the operation name and will be identified as the name of the span in the trace.
- [Disable Tracing For Functions](https://docs.langtrace.ai/tracing/disable_function_tracing.md): You can disable tracing for specific functions per vendor by passing the `disable_tracing_for_functions` parameter to the `Langtrace.init` function. The `object` should have the vendor name as the `key` and an `array` of **function names** as the `value`.
- [Disable Tracing For Prompts & Completions](https://docs.langtrace.ai/tracing/disable_prompt_&_completions_tracing.md)
- [Disable Tracing For Specific Vendors](https://docs.langtrace.ai/tracing/disable_vendor_instrumentation.md)
- [Filtering Traces by Session ID](https://docs.langtrace.ai/tracing/filter_by_session_id.md): The Langtrace SDK allows you to pass a session ID to traces to filter them later.
- [Grouping Operations](https://docs.langtrace.ai/tracing/group_traces.md): The Langtrace SDK allows you to group related operations together using the `@with_langtrace_root_span` decorator for Python or `WithLangTraceRootSpan` for Typescript.
- [Chat completion](https://docs.langtrace.ai/tracing/openai/chat-completion.md)
- [Tracing Overview](https://docs.langtrace.ai/tracing/overview.md)
- [Sending Traces](https://docs.langtrace.ai/tracing/send_traces.md)
- [Pass User Feedback](https://docs.langtrace.ai/tracing/trace_user_feedback.md): The Langtrace SDK gives you the ability to pass user feedback from your application as scores for measuring accuracy.

## Optional

- [GitHub](https://github.com/Scale3-Labs/langtrace)
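Several of the tracing pages above (additional attributes, prompt IDs, user IDs, custom span names) rely on the same wrapping pattern: pass your traced work as a callable plus a dict of extra attributes to `inject_additional_attributes` (Python) or `withAdditionalAttributes` (Typescript). The sketch below is illustrative only and does not use the real SDK; `inject_additional_attributes` here is a stand-in that reproduces the calling convention described in the pages above, while the actual helper in `langtrace_python_sdk` attaches the attributes to the active trace span.

```python
from contextvars import ContextVar

# Ambient attribute store, standing in for the active span's attributes.
_span_attributes: ContextVar[dict] = ContextVar("span_attributes", default={})


def inject_additional_attributes(fn, attributes=None):
    """Stand-in: run `fn` with `attributes` attached to the tracing context.

    Mirrors the documented usage (wrap the work in a callable, pass a dict);
    the real SDK helper stamps these onto the trace spans created inside `fn`.
    """
    token = _span_attributes.set({**_span_attributes.get(), **(attributes or {})})
    try:
        return fn()
    finally:
        _span_attributes.reset(token)


def generate_reply():
    # A real handler would call an LLM here; we just return the attributes
    # a tracer would record on the resulting span.
    return dict(_span_attributes.get())


# User IDs and prompt IDs/versions ride along as span attributes, which
# Langtrace can later use to filter and group traces.
attrs = inject_additional_attributes(
    generate_reply,
    {"user.id": "user-123", "prompt.id": "greeting", "prompt.version": "2"},
)
print(attrs)
```

The attribute keys here (`user.id`, `prompt.id`, `prompt.version`) are hypothetical examples; use whatever keys your filtering and grouping workflow in Langtrace expects.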