Arch
Arch is an intelligent gateway designed to protect, observe, and personalize AI agents with your APIs. Engineered with purpose-built LLMs, Arch handles the critical but undifferentiated work of prompt handling and processing: detecting and rejecting jailbreak attempts, intelligently calling “backend” APIs to fulfill the request expressed in a prompt, routing to and offering disaster recovery between upstream LLMs, and centrally managing the observability of prompts and LLM API calls.
Using Langtrace to monitor your Arch apps is quick and easy. Follow these steps:
Setup
- Install and run Arch by following the steps outlined in their docs.
- Install the Langtrace SDK (the install command is shown after this list).
Note: You’ll need an API key from Langtrace. Sign up for Langtrace if you haven’t done so already.
- Set up environment variables (see the example after this list).
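For the Python SDK, installation is a single pip command. The sketch below also installs the openai client library, which the usage example further down assumes you use to talk to Arch's OpenAI-compatible endpoint:

```bash
# Langtrace Python SDK; the openai package is only needed for the usage example below
pip install langtrace-python-sdk openai
```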
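A minimal environment setup only needs your Langtrace API key. The variable name below is the one the Python SDK reads by default; you can also pass the key directly to langtrace.init() if you prefer:

```bash
# API key generated from your Langtrace project settings
export LANGTRACE_API_KEY="<your-langtrace-api-key>"
```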
Usage
Generate a simple output with your deployment’s model:
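The following is a minimal sketch, assuming Arch is running locally and exposing an OpenAI-compatible chat-completions endpoint. The base URL, port, and model name are placeholders; replace them with the listener address and model defined in your Arch config:

```python
import os

# Initialize Langtrace before creating the LLM client so that calls are instrumented
from langtrace_python_sdk import langtrace

langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])

from openai import OpenAI

# Point the OpenAI client at the Arch gateway instead of api.openai.com.
# base_url and model are placeholders; use the values from your Arch config.
client = OpenAI(
    base_url="http://localhost:12000/v1",  # assumed local Arch listener
    api_key="not-needed",                  # Arch manages upstream LLM credentials
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use the model routed by your Arch deployment
    messages=[{"role": "user", "content": "Say hello from behind the Arch gateway."}],
)

print(response.choices[0].message.content)
```

Because Langtrace auto-instruments the OpenAI client, the request above shows up as a trace even though it is served through Arch rather than OpenAI directly.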
You can now view your traces on the Langtrace dashboard.