# API Reference

## LlmProvider

Provider enum used by `AppConfig`, `AgentBuilder`, and `DynamicLlmClient`. `LlmProvider` selects the backend client Appam will instantiate at runtime.

### Definition
```rust
pub enum LlmProvider {
    OpenRouterCompletions,
    OpenRouterResponses,
    Anthropic,
    OpenAI,
    OpenAICodex,
    Vertex,
    AzureOpenAI { resource_name: String, api_version: String },
    AzureAnthropic { base_url: String, auth_method: AzureAnthropicAuthMethod },
    Bedrock { region: String, model_id: String, auth_method: BedrockAuthMethod },
}
```

The default is `OpenRouterCompletions`.
### Parsing and display

`FromStr` accepts:

- `OpenRouterCompletions`: `openrouter`, `openrouter-completions`, `openroutercompletions`
- `OpenRouterResponses`: `openrouter-responses`, `openrouterresponses`
- `Anthropic`: `anthropic`
- `OpenAI`: `openai`
- `OpenAICodex`: `openai-codex`, `openai_codex`, `codex`
- `Vertex`: `vertex`, `google-vertex`, `google_vertex`
- `AzureOpenAI`: `azure-openai`, `azure_openai`, `azure`
- `AzureAnthropic`: `azure-anthropic`, `azure_anthropic`
- `Bedrock`: `bedrock`, `aws-bedrock`, `aws_bedrock`

When parsing the `azure*` or `bedrock` aliases, the struct-variant fields are filled from environment variables inside `from_str(...)`.
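As an illustration only, the alias table above can be sketched as a plain normalization function. The function name and `Option<&str>` shape are assumptions for the sketch; the real `from_str` returns `LlmProvider` values and also fills the struct-variant fields from the environment, which is omitted here:

```rust
// Illustrative sketch of the alias normalization above. Canonical names
// mirror the Display strings; environment-variable handling is omitted.
fn canonical_provider(s: &str) -> Option<&'static str> {
    match s {
        "openrouter" | "openrouter-completions" | "openroutercompletions" => {
            Some("openrouter-completions")
        }
        "openrouter-responses" | "openrouterresponses" => Some("openrouter-responses"),
        "anthropic" => Some("anthropic"),
        "openai" => Some("openai"),
        "openai-codex" | "openai_codex" | "codex" => Some("openai-codex"),
        "vertex" | "google-vertex" | "google_vertex" => Some("vertex"),
        "azure-openai" | "azure_openai" | "azure" => Some("azure-openai"),
        "azure-anthropic" | "azure_anthropic" => Some("azure-anthropic"),
        "bedrock" | "aws-bedrock" | "aws_bedrock" => Some("bedrock"),
        _ => None,
    }
}

fn main() {
    // The short aliases map to the same canonical names as the long ones.
    assert_eq!(canonical_provider("codex"), Some("openai-codex"));
    assert_eq!(canonical_provider("azure"), Some("azure-openai"));
    assert_eq!(canonical_provider("aws_bedrock"), Some("bedrock"));
    assert_eq!(canonical_provider("unknown"), None);
}
```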
`Display` returns:

- `openrouter-completions`
- `openrouter-responses`
- `anthropic`
- `openai`
- `openai-codex`
- `vertex`
- `azure-openai[resource@api-version]`
- `azure-anthropic[base-url:auth]`
- `bedrock[model_id@region:auth]`
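A minimal sketch of how the bracketed `Display` forms for the parameterized variants are assembled, using stand-in types rather than `LlmProvider` itself. The enum name, the `auth` string, and all example values (`"sso"`, the model id, the API version) are placeholders for illustration, not real `AuthMethod` values:

```rust
use std::fmt;

// Simplified stand-ins for the two parameterized variants, to show the
// bracketed Display layout from the list above.
enum ProviderTag {
    AzureOpenAi { resource_name: String, api_version: String },
    Bedrock { region: String, model_id: String, auth: &'static str },
}

impl fmt::Display for ProviderTag {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            // azure-openai[resource@api-version]
            ProviderTag::AzureOpenAi { resource_name, api_version } => {
                write!(f, "azure-openai[{resource_name}@{api_version}]")
            }
            // bedrock[model_id@region:auth]
            ProviderTag::Bedrock { region, model_id, auth } => {
                write!(f, "bedrock[{model_id}@{region}:{auth}]")
            }
        }
    }
}

fn main() {
    let b = ProviderTag::Bedrock {
        region: "us-east-1".into(),       // placeholder region
        model_id: "anthropic.claude-3".into(), // placeholder model id
        auth: "sso",                      // placeholder auth label
    };
    assert_eq!(b.to_string(), "bedrock[anthropic.claude-3@us-east-1:sso]");

    let a = ProviderTag::AzureOpenAi {
        resource_name: "my-resource".into(), // placeholder resource
        api_version: "2024-06-01".into(),    // placeholder version
    };
    assert_eq!(a.to_string(), "azure-openai[my-resource@2024-06-01]");
}
```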
### Pricing normalization

`pricing_key()` intentionally collapses variants:
| Variant | `pricing_key()` |
|---|---|
| `Anthropic`, `AzureAnthropic`, `Bedrock` | `anthropic` |
| `OpenAI`, `OpenAICodex`, `AzureOpenAI` | `openai` |
| `OpenRouterCompletions`, `OpenRouterResponses` | `openrouter` |
| `Vertex` | `vertex` |
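The collapse above can be sketched as a plain lookup, keyed here on variant names rather than on `LlmProvider` itself (a simplified illustration; the real `pricing_key()` is a method on the enum):

```rust
// Sketch of the pricing normalization table: Azure- and Bedrock-hosted
// variants fold into their upstream vendor's key.
fn pricing_key(variant: &str) -> Option<&'static str> {
    match variant {
        "Anthropic" | "AzureAnthropic" | "Bedrock" => Some("anthropic"),
        "OpenAI" | "OpenAICodex" | "AzureOpenAI" => Some("openai"),
        "OpenRouterCompletions" | "OpenRouterResponses" => Some("openrouter"),
        "Vertex" => Some("vertex"),
        _ => None,
    }
}

fn main() {
    // Hosted Anthropic variants all share the "anthropic" pricing key.
    assert_eq!(pricing_key("Bedrock"), Some("anthropic"));
    assert_eq!(pricing_key("AzureOpenAI"), Some("openai"));
    assert_eq!(pricing_key("Vertex"), Some("vertex"));
}
```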
### Example

The `?` in `build()?` requires an enclosing function that returns a `Result`:

```rust
use appam::prelude::*;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let agent = AgentBuilder::new("assistant")
        .provider(LlmProvider::OpenAI)
        .model("gpt-5.4")
        .system_prompt("You are helpful.")
        .build()?;
    Ok(())
}
```