# Getting Started
## Installation

```sh
pip install cyclops-ai
```

Or with uv (recommended):

```sh
uv add cyclops-ai
```

Python 3.10 or later is required.
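If you're not sure which interpreter a script will run under, a quick sanity check helps; the `(3, 10)` floor below comes from the requirement above (the helper name is just for illustration):

```python
import sys

def meets_requirement(min_version=(3, 10)):
    """Return True when the running interpreter satisfies the Python 3.10+ floor."""
    return sys.version_info[:2] >= min_version

print(meets_requirement())
```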
## Set your API key

Cyclops uses LiteLLM under the hood, so credentials are just environment variables. Set the one for the provider you want.

```sh
# Groq: free tier, fast (good for experimenting)
export GROQ_API_KEY="gsk_..."

# OpenAI
export OPENAI_API_KEY="sk-..."

# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."

# Together AI
export TOGETHERAI_API_KEY="..."

# Ollama: no key needed; install from https://ollama.ai
```

You can put these in a `.env` file and load them with python-dotenv.
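For intuition, python-dotenv essentially reads `KEY=VALUE` lines from the `.env` file into the process environment. A minimal stand-in (illustrative only, not the library's actual code):

```python
import os

# Parse KEY=VALUE lines (skipping blanks and # comments) into a dict,
# mimicking the core of python-dotenv's load_dotenv().
def parse_env(text):
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

contents = '# provider credentials\nGROQ_API_KEY="gsk_example"\n'
os.environ.update(parse_env(contents))
print(os.environ["GROQ_API_KEY"])  # → gsk_example
```

In practice, just `pip install python-dotenv` and call `load_dotenv()` at startup.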
## First agent

Three lines to get a response:

```python
from cyclops import Agent, AgentConfig

agent = Agent(AgentConfig(model="groq/llama-3.1-8b-instant"))
print(agent.run("What is the capital of France?"))
```

The agent keeps conversation history automatically, so the next call sees the full prior context:

```python
agent.run("What is the capital of France?")
agent.run("And what is its population?")  # knows "it" means Paris
```
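The memory model can be pictured as a per-agent message list that grows with each turn. A toy sketch (illustrative only, not Cyclops's actual internals — the replies here are canned since there is no model behind it):

```python
# Each run() appends the user turn and the assistant reply, so every later
# call sees the full prior context.
class ToyAgent:
    def __init__(self):
        self.history = []  # list of {"role": ..., "content": ...} dicts

    def run(self, prompt, canned_reply):
        self.history.append({"role": "user", "content": prompt})
        self.history.append({"role": "assistant", "content": canned_reply})
        return canned_reply

agent = ToyAgent()
agent.run("What is the capital of France?", "Paris")
agent.run("And what is its population?", "About 2 million")
print(len(agent.history))  # → 4
```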
## First agent with tools

Decorate any function with `@tool`. The function name becomes the tool name, the docstring becomes the description, and the type annotations become the JSON schema.

```python
from datetime import datetime

from cyclops import Agent, AgentConfig
from cyclops.toolkit import tool

@tool
def get_time() -> str:
    """Return the current time as HH:MM:SS."""
    return datetime.now().strftime("%H:%M:%S")

@tool
def add(a: float, b: float) -> float:
    """Add two numbers and return the result."""
    return a + b

agent = Agent(AgentConfig(model="groq/llama-3.1-8b-instant"), tools=[get_time, add])

print(agent.run("What time is it right now?"))
print(agent.run("What is 42 plus 58?"))
```
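To see roughly what the decorator hands to the model, here is a sketch of deriving a JSON-schema tool description from a function's signature (illustrative; Cyclops's actual `tool` implementation may differ):

```python
import inspect

# Name from __name__, description from the docstring, parameter types from
# annotations — the same three ingredients the @tool decorator uses.
PY_TO_JSON = {str: "string", float: "number", int: "integer", bool: "boolean"}

def describe_tool(fn):
    params = {
        name: {"type": PY_TO_JSON.get(p.annotation, "string")}
        for name, p in inspect.signature(fn).parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn),
        "parameters": {"type": "object", "properties": params},
    }

def add(a: float, b: float) -> float:
    """Add two numbers and return the result."""
    return a + b

print(describe_tool(add)["parameters"]["properties"])
# → {'a': {'type': 'number'}, 'b': {'type': 'number'}}
```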
## Async usage

Every sync method has an async counterpart. Use `arun`, `astream`, and `arun_with_response` inside any async context.

```python
import asyncio

from cyclops import Agent, AgentConfig

async def main():
    agent = Agent(AgentConfig(model="groq/llama-3.1-8b-instant"))

    # Sequential calls share history.
    await agent.arun("Name three planets.")
    await agent.arun("Which is largest?")

    # Run two independent agents concurrently.
    a = Agent(AgentConfig(model="groq/llama-3.1-8b-instant"))
    b = Agent(AgentConfig(model="groq/llama-3.1-8b-instant"))
    results = await asyncio.gather(
        a.arun("Capital of Japan?"),
        b.arun("Capital of Brazil?"),
    )
    print(results)

asyncio.run(main())
```
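The payoff of `asyncio.gather` is wall-clock time: independent awaits overlap instead of queueing. A self-contained timing demo, with plain `asyncio.sleep` standing in for model calls:

```python
import asyncio
import time

# Two ~0.1s tasks awaited via gather overlap, so total wall time is
# close to 0.1s rather than the ~0.2s a sequential version would take.
async def slow_call(tag):
    await asyncio.sleep(0.1)
    return tag

async def main():
    start = time.perf_counter()
    results = await asyncio.gather(slow_call("a"), slow_call("b"))
    elapsed = time.perf_counter() - start
    print(results, f"{elapsed:.2f}s")
    return results, elapsed

asyncio.run(main())
```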
## Switch providers

Change the model string. Nothing else changes.

```python
# Ollama (local, no API key)
Agent(AgentConfig(model="ollama/qwen3:4b"))

# OpenAI
Agent(AgentConfig(model="gpt-4o-mini"))

# Anthropic
Agent(AgentConfig(model="claude-3-5-haiku-20241022"))

# Together AI
Agent(AgentConfig(model="together_ai/meta-llama/Llama-3-8b-chat-hf"))
```

Ollama runs entirely on your machine. Pull a model with `ollama pull qwen3:4b` and use `"ollama/<model-name>"` with no API key.