Framework Integrations
Use Kalibr with CrewAI, LangChain, and OpenAI Agents SDK.
Overview
Kalibr integrates with popular agent frameworks in two ways: `router.as_langchain()` returns a LangChain-compatible LLM that routes every call through Kalibr, and `get_policy()` returns Kalibr's recommended model name for SDKs that only need a plain model string.
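Whichever framework you use, the shape is the same: create a `Router` for a goal, hand its LLM (or recommended model) to the framework, run the task, then report the outcome. A minimal sketch of that loop, using only the calls shown in the examples below (the goal name is a placeholder):

```python
from kalibr import Router

# One router per goal; `paths` lists the candidate models Kalibr may pick.
router = Router(
    goal="my_task",
    paths=["gpt-4o-mini", "claude-sonnet-4-20250514"],
)

# LangChain-compatible LLM for frameworks that accept one (CrewAI, LangChain).
llm = router.as_langchain()

# ... run your framework with `llm` ...

# Close the loop: tell Kalibr whether the run succeeded.
router.report(success=True)
```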
CrewAI
Install
```bash
pip install kalibr crewai openai anthropic
```
Code (Python only)
```python
from kalibr import Router
from crewai import Agent, Task, Crew

# Create Kalibr router
router = Router(
    goal="research_task",
    paths=["gpt-4o-mini", "claude-sonnet-4-20250514"]
)

# Get LangChain-compatible LLM
llm = router.as_langchain()

# Use with CrewAI
researcher = Agent(
    role="Researcher",
    goal="Find accurate information",
    backstory="You are a research assistant.",
    llm=llm,  # Kalibr handles model selection
    verbose=True
)

task = Task(
    description="Research the history of Python programming language.",
    expected_output="A summary of Python's history.",
    agent=researcher
)

crew = Crew(agents=[researcher], tasks=[task])
result = crew.kickoff()

# Report outcome to Kalibr
router.report(success=len(str(result)) > 100)
```
Note: CrewAI requires Python 3.10+.
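If the crew run raises, Kalibr should still hear about the failure. A sketch of one way to wrap the kickoff, continuing the example above and using only the `report()` call already shown (the 100-character check is just a placeholder success criterion):

```python
try:
    result = crew.kickoff()
    # Treat a reasonably long answer as success; tune this check to your task.
    router.report(success=len(str(result)) > 100)
except Exception:
    # Record the failed run as well before re-raising.
    router.report(success=False)
    raise
```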
LangChain
Install
```bash
pip install kalibr langchain langchain-openai langchain-anthropic
```
Code (Python only)
```python
from kalibr import Router
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

router = Router(
    goal="summarize_text",
    paths=["gpt-4o-mini", "claude-sonnet-4-20250514"]
)

llm = router.as_langchain()

prompt = ChatPromptTemplate.from_template("Summarize this: {text}")
chain = prompt | llm | StrOutputParser()

result = chain.invoke({"text": "Long article text here..."})
router.report(success=len(result) > 50)
```
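Because `as_langchain()` returns a standard LangChain LLM, it also drops into chat-style prompts and larger chains. A sketch continuing the example above (the system message wording is illustrative):

```python
chat_prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a concise technical summarizer."),
    ("human", "Summarize this: {text}"),
])

chat_chain = chat_prompt | llm | StrOutputParser()
summary = chat_chain.invoke({"text": "Long article text here..."})

# Same feedback loop as before.
router.report(success=len(summary) > 50)
```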
OpenAI Agents SDK
Install
```bash
pip install kalibr openai-agents
```
Code (Python only)
```python
from kalibr import Router, get_policy
from agents import Agent, Runner  # the openai-agents package installs the `agents` module

router = Router(
    goal="agent_task",
    paths=["gpt-4o", "gpt-4o-mini"]
)

# Get the routing decision as a plain model name (goal is required)
policy = get_policy(goal="agent_task")
model = policy["recommended_model"]

# Use with the OpenAI Agents SDK (name and instructions are illustrative)
agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant.",
    model=model
)
result = Runner.run_sync(agent, "Your task here")

# Report outcome to Kalibr
router.report(success=bool(result.final_output))
```
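The Agents SDK also exposes an async runner; the Kalibr side is unchanged. A sketch, assuming the `agent` and `router` from the example above:

```python
import asyncio

from agents import Runner


async def main() -> None:
    result = await Runner.run(agent, "Your task here")
    router.report(success=bool(result.final_output))


asyncio.run(main())
```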
When to Use as_langchain() vs get_policy()
| Method | Use When |
|---|---|
| `as_langchain()` | Framework expects a LangChain LLM (CrewAI, LangChain chains) |
| `get_policy()` | You need the model name to pass to another SDK |
| `router.completion()` | Direct LLM calls without a framework |
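For direct calls without a framework, `router.completion()` is the entry point. A minimal sketch; the `messages` argument and the shape of the response are assumptions (an OpenAI-style interface), so check the API Reference for the actual signature:

```python
from kalibr import Router

router = Router(
    goal="quick_answer",
    paths=["gpt-4o-mini", "claude-sonnet-4-20250514"]
)

# Assumed OpenAI-style `messages` argument; verify against the API Reference.
response = router.completion(
    messages=[{"role": "user", "content": "Summarize Kalibr in one sentence."}]
)

router.report(success=bool(response))
```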
Next
- API Reference - Full Router API
- Production Guide - Error handling, monitoring