# Manual Tracing

Track non-LLM operations with the `@trace` decorator.


You will learn:

- When to use manual tracing
- How to use the `@trace` decorator
- Best practices for naming


## When to Use

LLM calls are auto-tracked. Use manual tracing for:

- Database queries
- API calls
- File processing
- Custom business logic


## Basic Usage

```python
from kalibr import trace

@trace(operation="fetch_documents", provider="database", model="postgres")
def fetch_documents(query):
    return db.query(query)
```

## Parameters

| Parameter   | Required | Description           |
|-------------|----------|-----------------------|
| `operation` | Yes      | Name of the operation |
| `provider`  | Yes      | Source system         |
| `model`     | Yes      | Resource type         |
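Conceptually, a decorator like this is a thin wrapper: it attaches the metadata above to each call and records how long the call took. Below is a minimal illustrative sketch of the pattern, not kalibr's actual implementation — `sketch_trace` and `SPANS` are hypothetical names used only for this example:

```python
import functools
import time

SPANS = []  # collected span records; a real tracer would export these

def sketch_trace(operation, provider, model):
    """Illustrative stand-in for @trace: records metadata and duration."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                # Record one span per call, even if the function raises.
                SPANS.append({
                    "operation": operation,
                    "provider": provider,
                    "model": model,
                    "duration_s": time.perf_counter() - start,
                })
        return wrapper
    return decorator

@sketch_trace(operation="square", provider="internal", model="demo")
def square(x):
    return x * x

square(4)  # runs normally; one span record is appended as a side effect
```

The three required parameters map directly onto the span record, which is why consistent naming matters: the dashboard groups spans by these values.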

## Examples

Database:

```python
@trace(operation="query_users", provider="database", model="postgres")
def query_users(filter):
    return db.query(filter)
```

External API:

```python
@trace(operation="fetch_weather", provider="api", model="openweathermap")
def fetch_weather(city):
    return requests.get(f"https://api.weather.com/{city}")
```

FastAPI route:

```python
@app.post("/api/analyze")
@trace(operation="analyze", provider="api", model="endpoint")
async def analyze(text: str):
    return await process(text)
```
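In the FastAPI example, decorator order matters: `@trace` sits closest to the function so it wraps the handler itself, and the route decorator then registers the already-wrapped callable. Stacked decorators apply bottom-up but execute top-down, which a few lines of plain Python (with illustrative names, unrelated to kalibr) make concrete:

```python
calls = []  # records the order in which the wrappers run

def outer(fn):
    def wrapper(*args):
        calls.append("outer")
        return fn(*args)
    return wrapper

def inner(fn):
    def wrapper(*args):
        calls.append("inner")
        return fn(*args)
    return wrapper

@outer  # applied second, runs first
@inner  # applied first, wraps greet directly
def greet(name):
    return f"hello {name}"

greet("world")
# calls == ["outer", "inner"]: the outermost decorator runs first
```

The same logic applies above: the route sees one traced handler, so every request through the route produces a span.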

## Combining with Auto-Instrumentation

```python
import kalibr
from kalibr import trace
import openai

@trace(operation="research", provider="internal", model="pipeline")
def research(query):
    docs = fetch_documents(query)  # Manual span

    # Auto-instrumented span
    response = openai.chat.completions.create(...)

    return response
```

Dashboard shows:

```
research (internal/pipeline) - 3.2s
  fetch_documents (database/postgres) - 0.8s
  openai.chat.completions (openai/gpt-4o) - 2.4s
```
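The parent-child nesting in that tree typically comes from tracking the currently active span while a traced function runs: any span started inside it becomes a child. A minimal sketch of that mechanism using `contextvars` — illustrative only, not kalibr's internals (`Span`, `traced`, and `ROOTS` are hypothetical names):

```python
import contextvars

_current = contextvars.ContextVar("span", default=None)
ROOTS = []  # top-level spans, the roots of the dashboard tree

class Span:
    def __init__(self, name):
        self.name = name
        self.children = []

def traced(name):
    def decorator(fn):
        def wrapper(*args, **kwargs):
            span = Span(name)
            parent = _current.get()
            # Attach to the active span if there is one, else it's a root.
            (parent.children if parent else ROOTS).append(span)
            token = _current.set(span)
            try:
                return fn(*args, **kwargs)
            finally:
                _current.reset(token)  # restore the previous active span
        return wrapper
    return decorator

@traced("fetch_documents")
def fetch_documents(query):
    return ["doc"]

@traced("research")
def research(query):
    return fetch_documents(query)

research("llms")
# ROOTS[0] is "research", with one child span, "fetch_documents"
```

Because the active span is carried in a context variable rather than a global, this shape also behaves correctly across async code.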

## Next Steps