Langtrace and LiteLLM Integration Guide
Langtrace integrates directly with LiteLLM, offering detailed, real-time insights into performance metrics such as cost, token usage, accuracy, and latency.
You’ll need an API key from Langtrace. Sign up for Langtrace if you haven’t already.
LiteLLM SDK
- Set up environment variables:
Shell
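Assuming the Langtrace SDK reads its key from a `LANGTRACE_API_KEY` environment variable, and that you also need a key for whichever provider you call through LiteLLM (both values below are placeholders), the setup might look like:

```shell
# Langtrace API key (assumed variable name: LANGTRACE_API_KEY)
export LANGTRACE_API_KEY="<your-langtrace-api-key>"

# Key for whichever model provider you call through LiteLLM, e.g. OpenAI
export OPENAI_API_KEY="<your-openai-api-key>"
```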
- Add the Langtrace callback to your LiteLLM client:
main.py
- Use LiteLLM completion:
main.py
LiteLLM Proxy
- Create `config.yaml`:
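A minimal sketch of a proxy config, assuming the proxy supports a `litellm_settings.success_callback` option with a `"langtrace"` entry; the model name and key reference are illustrative:

```yaml
# config.yaml — illustrative sketch, not a complete production config
model_list:
  - model_name: gpt-4o-mini
    litellm_params:
      model: openai/gpt-4o-mini
      api_key: os.environ/OPENAI_API_KEY

litellm_settings:
  # Assumes "langtrace" is a supported success callback name
  success_callback: ["langtrace"]
```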
- Run the LiteLLM Proxy:
Shell
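The proxy can then be started against that config file (this launches a long-running server; port 4000 is the assumed default):

```shell
# Start the LiteLLM proxy with the config above
litellm --config config.yaml
```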
- Test your setup:
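One way to verify, assuming the proxy from the previous step is running locally on port 4000, is to send an OpenAI-style request to the proxy's `/chat/completions` endpoint:

```shell
# Requires the proxy to be running locally on port 4000
curl http://0.0.0.0:4000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello from LiteLLM"}]
  }'
```

If the callback is wired up correctly, this request should appear as a trace shortly afterwards.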
You can now view your traces on the Langtrace dashboard.
Want to see more supported methods? Check out the sample code in the Langtrace Langchain Python Example.