Open-source from Winfunc Research
Build long-horizon agents in Rust.
Multi-provider LLM support, typed tools, real-time streaming, and session persistence — in one coherent crate.

```rust
use appam::prelude::*;

// An async main needs a runtime; tokio is assumed here for illustration.
#[tokio::main]
async fn main() -> Result<()> {
    let agent = Agent::quick(
        "anthropic/claude-sonnet-4-5",
        "You are a helpful assistant.",
        vec![],
    )?;

    agent
        .stream("Plan a release checklist")
        .on_content(|text| print!("{}", text))
        .run()
        .await?;

    Ok(())
}
```

Capabilities
Everything agents need to run in production.
Eight Providers, One API
Anthropic, OpenAI, Vertex, OpenRouter, Azure, Bedrock, Codex — swap with a single line change. No vendor lock-in, no abstraction tax.
Typed Tool System
Define tools as Rust structs with #[tool], closures, or TOML declarations. Full type safety at compile time, zero boilerplate at runtime.
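The `#[tool]` macro and TOML formats are not shown on this page, but the compile-time-safety claim can be illustrated with a rough, self-contained sketch of the underlying pattern. The trait and struct below are hypothetical stand-ins, not appam's actual API:

```rust
// Hypothetical sketch of a typed tool: argument and return types are
// ordinary Rust types, so misuse fails at compile time rather than at
// runtime. (Trait and struct names are illustrative, not appam's API.)
trait Tool {
    type Args;
    type Output;
    fn name(&self) -> &'static str;
    fn call(&self, args: Self::Args) -> Self::Output;
}

struct WordCount;

impl Tool for WordCount {
    type Args = String;
    type Output = usize;

    fn name(&self) -> &'static str {
        "word_count"
    }

    fn call(&self, args: String) -> usize {
        args.split_whitespace().count()
    }
}

fn main() {
    let tool = WordCount;
    assert_eq!(tool.name(), "word_count");
    assert_eq!(tool.call("plan a release checklist".into()), 4);
    println!("ok");
}
```

Because `Args` and `Output` are associated types, passing the wrong shape of argument is a compile error, which is the "full type safety at compile time" property described above.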
Streaming by Default
Real-time events to console, channels, callbacks, or custom consumers. Token-by-token control over every response as it arrives.
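As a self-contained sketch of the channel-based consumer style described above (the event enum is hypothetical, not appam's type), a producer emits content events token by token and the receiver handles each as it arrives:

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical stream-event type, illustrative only.
enum StreamEvent {
    Content(String),
    Done,
}

fn main() {
    let (tx, rx) = mpsc::channel();

    // Producer thread: emits tokens one by one, then a Done marker.
    let producer = thread::spawn(move || {
        for token in ["Plan ", "a ", "release ", "checklist"] {
            tx.send(StreamEvent::Content(token.to_string())).unwrap();
        }
        tx.send(StreamEvent::Done).unwrap();
    });

    // Consumer: token-by-token control as each event arrives.
    let mut response = String::new();
    for event in rx {
        match event {
            StreamEvent::Content(text) => response.push_str(&text),
            StreamEvent::Done => break,
        }
    }
    producer.join().unwrap();

    assert_eq!(response, "Plan a release checklist");
    println!("{}", response);
}
```

The same shape works for the callback style shown in the hero snippet: `on_content` is simply the consumer side inlined as a closure.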
Session Persistence
Conversations survive restarts via SQLite. Resume, query, and inspect any session long after the agent has finished running.
Built-in Tracing
JSONL traces, structured stream events, and SQLite-backed history give you full observability without bolting on external tooling.
Production Reliability
Retries with exponential backoff, continuation mechanics, rate limiting, and provider-specific tuning. Built for jobs that cannot fail silently.
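The retry-with-exponential-backoff pattern named above can be sketched in self-contained Rust. The parameters (base delay, cap, attempt limit) are illustrative, not appam's defaults:

```rust
use std::time::Duration;

// Delay doubles each attempt, capped at a maximum.
// (Parameters are illustrative, not appam's defaults.)
fn backoff_delay(attempt: u32, base_ms: u64, cap_ms: u64) -> Duration {
    let exp = base_ms.saturating_mul(1u64 << attempt.min(16));
    Duration::from_millis(exp.min(cap_ms))
}

// Retry a fallible operation up to max_attempts times.
fn retry<T, E>(max_attempts: u32, mut op: impl FnMut() -> Result<T, E>) -> Result<T, E> {
    let mut attempt = 0;
    loop {
        match op() {
            Ok(v) => return Ok(v),
            Err(e) if attempt + 1 >= max_attempts => return Err(e),
            Err(_) => {
                // In real code you would sleep between attempts:
                // std::thread::sleep(backoff_delay(attempt, 250, 30_000));
                attempt += 1;
            }
        }
    }
}

fn main() {
    assert_eq!(backoff_delay(0, 250, 30_000), Duration::from_millis(250));
    assert_eq!(backoff_delay(3, 250, 30_000), Duration::from_millis(2_000));
    assert_eq!(backoff_delay(10, 250, 30_000), Duration::from_millis(30_000));

    // An operation that succeeds on the third attempt.
    let mut calls = 0;
    let result = retry(5, || {
        calls += 1;
        if calls < 3 { Err("transient") } else { Ok(42) }
    });
    assert_eq!(result, Ok(42));
    assert_eq!(calls, 3);
    println!("ok");
}
```

The "cannot fail silently" property comes from the final arm: once attempts are exhausted, the last error is returned to the caller rather than swallowed.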
Provider Support
One crate. Eight providers. Zero friction.
Switch providers with a one-line model string change. The streaming API, tool system, and session management remain identical across all of them.
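The hero snippet above passes a `provider/model` string to `Agent::quick`, and that string is the one line that changes. A minimal sketch of the convention (the helper and the second model id are hypothetical):

```rust
// Hypothetical helper splitting a "provider/model" spec, the string
// convention shown in the Agent::quick snippet above.
fn split_model(spec: &str) -> Option<(&str, &str)> {
    spec.split_once('/')
}

fn main() {
    // The surrounding agent code is identical; only this string changes.
    assert_eq!(
        split_model("anthropic/claude-sonnet-4-5"),
        Some(("anthropic", "claude-sonnet-4-5"))
    );
    assert_eq!(
        split_model("openai/gpt-4o"), // hypothetical model id
        Some(("openai", "gpt-4o"))
    );
    println!("ok");
}
```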
Documentation
Navigate the knowledge graph.
Explore the full documentation as an interactive graph. Click any node to jump straight to that page.