clawdesk-providers

Integrates with AI model providers (Anthropic, OpenAI, Google Gemini, AWS Bedrock, Ollama). Each provider implements a common Provider trait. The crate also includes the capability negotiator for matching models to requirements and a provider registry for dynamic selection.

Dependencies

Internal: clawdesk-types

External: reqwest, async-trait, serde, tokio, tracing, futures

Modules

| Module | Description |
| --- | --- |
| `anthropic` | Anthropic Claude API (Messages API) |
| `bedrock` | AWS Bedrock runtime integration |
| `capability` | `Capability`, `CapabilitySet` for model feature matching |
| `gemini` | Google Gemini API integration |
| `negotiator` | Matches model capabilities to request requirements |
| `ollama` | Ollama local inference API |
| `openai` | OpenAI Chat Completions API |
| `registry` | `ProviderRegistry` for dynamic provider management |

Key Types & Traits

/// Core provider interface
#[async_trait]
pub trait Provider: Send + Sync {
    fn name(&self) -> &str;
    fn models(&self) -> Vec<ModelId>;

    async fn chat(
        &self,
        model: &ModelId,
        messages: &[Message],
        options: &ChatOptions,
    ) -> Result<Message, ProviderError>;

    async fn stream(
        &self,
        model: &ModelId,
        messages: &[Message],
        options: &ChatOptions,
    ) -> Result<StreamResponse, ProviderError>;
}

/// Model capabilities
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum Capability {
    Chat,
    Streaming,
    FunctionCalling,
    JsonMode,
    Vision,
    CodeExecution,
}
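The negotiator's matching rule can be illustrated with a minimal self-contained sketch. The `CapabilitySet` wrapper and its method names here are assumptions for illustration (the crate's real API may differ); the core idea is simply that a model satisfies a request when it offers every required capability.

```rust
use std::collections::HashSet;

#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
#[allow(dead_code)]
enum Capability {
    Chat,
    Streaming,
    FunctionCalling,
    JsonMode,
    Vision,
    CodeExecution,
}

// Hypothetical shape: a thin wrapper over a HashSet of capabilities.
struct CapabilitySet(HashSet<Capability>);

impl CapabilitySet {
    fn from_slice(caps: &[Capability]) -> Self {
        Self(caps.iter().copied().collect())
    }

    // A model satisfies a request iff it offers every required capability.
    fn satisfies(&self, required: &[Capability]) -> bool {
        required.iter().all(|c| self.0.contains(c))
    }
}

fn main() {
    let model = CapabilitySet::from_slice(&[
        Capability::Chat,
        Capability::Streaming,
        Capability::FunctionCalling,
    ]);
    assert!(model.satisfies(&[Capability::Chat, Capability::FunctionCalling]));
    assert!(!model.satisfies(&[Capability::Vision]));
    println!("capability matching ok");
}
```
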

/// Provider registry for dynamic lookup
pub struct ProviderRegistry {
    providers: HashMap<String, Box<dyn Provider>>,
}
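Since the registry stores providers as trait objects keyed by name, registration and lookup reduce to `HashMap` operations. The sketch below is self-contained: the stub `Provider` trait and the `register` method are assumptions standing in for the crate's real types.

```rust
use std::collections::HashMap;

// Stub trait standing in for the crate's async Provider trait.
trait Provider {
    fn name(&self) -> &str;
}

struct ProviderRegistry {
    providers: HashMap<String, Box<dyn Provider>>,
}

impl ProviderRegistry {
    fn new() -> Self {
        Self { providers: HashMap::new() }
    }

    // Key each provider by its self-reported name (hypothetical method).
    fn register(&mut self, provider: Box<dyn Provider>) {
        self.providers.insert(provider.name().to_string(), provider);
    }

    fn get(&self, name: &str) -> Option<&dyn Provider> {
        self.providers.get(name).map(|p| p.as_ref())
    }
}

struct Ollama;
impl Provider for Ollama {
    fn name(&self) -> &str {
        "ollama"
    }
}

fn main() {
    let mut registry = ProviderRegistry::new();
    registry.register(Box::new(Ollama));
    assert!(registry.get("ollama").is_some());
    assert!(registry.get("missing").is_none());
    println!("found: {}", registry.get("ollama").unwrap().name());
}
```

Keying by `name()` means registering two providers with the same name silently replaces the first, which is the usual `HashMap::insert` behavior.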

/// Chat request options
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ChatOptions {
    pub temperature: Option<f32>,
    pub max_tokens: Option<usize>,
    pub top_p: Option<f32>,
    pub stop: Option<Vec<String>>,
    pub tools: Option<Vec<ToolDefinition>>,
}
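Because every field is an `Option`, the idiomatic way to build a `ChatOptions` is to start from defaults and override only what you need via struct-update syntax. A minimal sketch (assuming the struct derives `Default`; the `tools` field is omitted here since `ToolDefinition` is defined elsewhere):

```rust
// Simplified copy of ChatOptions for a self-contained example.
#[derive(Debug, Clone, Default, PartialEq)]
struct ChatOptions {
    temperature: Option<f32>,
    max_tokens: Option<usize>,
    top_p: Option<f32>,
    stop: Option<Vec<String>>,
}

fn main() {
    // Override only temperature and max_tokens; everything else stays None,
    // letting the provider apply its own defaults.
    let opts = ChatOptions {
        temperature: Some(0.2),
        max_tokens: Some(1024),
        ..ChatOptions::default()
    };
    assert_eq!(opts.temperature, Some(0.2));
    assert_eq!(opts.max_tokens, Some(1024));
    assert!(opts.top_p.is_none());
    println!("{:?}", opts);
}
```

Leaving a field as `None` signals "use the provider's default" rather than sending an explicit value over the wire.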

Provider Matrix

| Provider | Chat | Stream | Tools | Vision | Local |
| --- | --- | --- | --- | --- | --- |
| Anthropic | ✓ | ✓ | ✓ | ✓ | |
| OpenAI | ✓ | ✓ | ✓ | ✓ | |
| Gemini | ✓ | ✓ | ✓ | ✓ | |
| Bedrock | ✓ | ✓ | ✓ | ✓ | |
| Ollama | ✓ | ✓ | model-dependent | model-dependent | ✓ |

Example Usage

use clawdesk_providers::{Capability, ChatOptions, ProviderRegistry};
use clawdesk_types::{Message, ModelId};

let registry = ProviderRegistry::new();

// Get a specific provider
let anthropic = registry.get("anthropic").unwrap();

// Chat with a model
let response = anthropic
    .chat(
        &ModelId::from("claude-sonnet-4-20250514"),
        &[Message::user("Hello!")],
        &ChatOptions::default(),
    )
    .await?;

// Use the negotiator to find a capable model
let model = registry.negotiate(&[Capability::Chat, Capability::FunctionCalling])?;
See Adding a Provider for a guide on implementing new provider integrations.