Anthropic is an AI research company behind the Claude model family. Whether you're brainstorming alone or building with a team of thousands, Claude is here to help.
Using Langtrace to monitor your Claude-backed LLM apps is quick and easy. Follow these steps:
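Both SDKs read their credentials from environment variables. Assuming you have generated keys from the Langtrace and Anthropic dashboards, export them before running the example (the key values below are placeholders):

```shell
# Placeholder values — substitute your own keys
export LANGTRACE_API_KEY="your-langtrace-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"
```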
Generate a simple output with your deployment’s model:
```python
# Imports
import os

# Must precede any LLM module imports
from langtrace_python_sdk import langtrace

langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])

from anthropic import Anthropic

client = Anthropic(
    # This is the default and can be omitted
    api_key=os.environ.get("ANTHROPIC_API_KEY"),
)

# Query Anthropic's claude-3-opus model
message = client.messages.create(
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Hello, Claude",
        }
    ],
    model="claude-3-opus-20240229",
)
print(message)
```
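Once the request succeeds, the model's reply lives in the `content` field of the returned message, which is a list of content blocks rather than a plain string. A minimal, offline sketch of pulling the text out — using a stand-in dataclass instead of a real API response, so the names here are assumptions for illustration:

```python
from dataclasses import dataclass


@dataclass
class TextBlock:
    # Stand-in for the text content block returned by the Messages API
    # (assumption: real responses carry `type` and `text` fields like this)
    type: str
    text: str


def reply_text(content_blocks):
    """Concatenate the text of all text blocks in a response's content list."""
    return "".join(b.text for b in content_blocks if b.type == "text")


# Stand-in for `message.content` from client.messages.create(...)
blocks = [TextBlock(type="text", text="Hello! How can I help you today?")]
print(reply_text(blocks))
```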
You can now view your traces on the Langtrace dashboard. Want to see more supported methods? Check out the sample code in the Langtrace Anthropic Python Example repository.