Installing Langtrace in KubeAI
KubeAI serves models that are declared as Kubernetes Custom Resources of kind: Model. KubeAI can be thought of as a Model Operator that manages vLLM and Ollama servers.
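As an illustration of the Custom Resource approach, a Model manifest might look something like the sketch below. The field names, model URL, and resource profile here are assumptions for illustration, not values taken from this tutorial; consult the KubeAI documentation for the exact schema.

```yaml
# Illustrative sketch of a KubeAI Model resource; field names and values
# are assumptions, not taken from this tutorial.
apiVersion: kubeai.org/v1
kind: Model
metadata:
  name: gemma2-2b-cpu
spec:
  features: [TextGeneration]
  url: ollama://gemma2:2b     # assumed: model source reference
  engine: OLlama              # KubeAI manages vLLM and Ollama servers
  resourceProfile: cpu:2      # assumed: CPU-only resource profile
```

Applying a manifest like this (e.g. with kubectl apply) would ask the KubeAI operator to run and manage the corresponding model server.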
In this tutorial you will learn how to deploy KubeAI and Langtrace end-to-end. Both KubeAI and Langtrace are installed in your Kubernetes cluster. No cloud services or external dependencies are required.
Replace langtrace_api_key with your API key.
Create a file named langtrace-example.py with the following content:
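The sketch below shows what langtrace-example.py might contain, assuming the langtrace-python-sdk and openai packages are installed. The in-cluster service URLs and the model name are placeholder assumptions for illustration; substitute the endpoints and Model name from your own deployment.

```python
# langtrace-example.py -- illustrative sketch; the service URLs and model
# name below are assumptions, not values from this tutorial.
from langtrace_python_sdk import langtrace
from openai import OpenAI

# Point the Langtrace SDK at the self-hosted Langtrace deployment
# running inside the cluster (assumed in-cluster address).
langtrace.init(
    api_key="langtrace_api_key",  # replace with your API key
    api_host="http://langtrace-app.langtrace.svc.cluster.local:3000/api/trace",
)

# KubeAI exposes an OpenAI-compatible API, so the standard OpenAI client
# can be pointed at it (assumed in-cluster address).
client = OpenAI(
    base_url="http://kubeai/openai/v1",
    api_key="ignored",  # KubeAI itself does not require a key here
)

# Send a chat completion to a model managed by KubeAI; Langtrace records
# the request and response as a trace.
response = client.chat.completions.create(
    model="gemma2-2b-cpu",  # assumed: name of a deployed Model resource
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Running the script inside the cluster (or with port-forwarded services) should produce a completion on stdout and a corresponding trace in the Langtrace dashboard.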