# RSADK - Rust Agent Development Kit
A feature-complete Rust-based AI agent framework inspired by Google's ADK, implementing the A2A (Agent-to-Agent) protocol for interoperability.
## Features
- **Multiple LLM Providers**: OpenAI, Anthropic Claude, and Google Gemini with a unified interface
- **Tool System**: Extensible tool framework with built-in calculator, datetime, and HTTP fetch tools
- **Agentic Loop**: Automatic tool execution until task completion
- **Session Management**: Persistent conversation state and memory
- **Event-Driven Architecture**: Real-time event streaming for observability
- **A2A Protocol Support**: Full JSON-RPC server for agent-to-agent communication
- **CLI Interface**: Interactive REPL, single query, and server modes
## Design Philosophy
RSADK implements the principles outlined in Google Cloud's [Four Steps for Building Multi-Agent Systems](https://cloud.google.com/blog/topics/startups/four-steps-for-startups-to-build-multi-agent-systems):
| Principle | RSADK Implementation |
|-----------|---------------------|
| **Foundation (ADK)** | Rust-native ADK with `LlmAgentBuilder` for composable agents |
| **Hybrid Architecture** | Modular `Runner` orchestration with pluggable LLM providers |
| **Tool-Based Execution** | Extensible `Tool` trait with `ToolRegistry` for state isolation |
| **Agent Interoperability** | Full A2A Protocol support for multi-agent communication |
## Installation
### Prerequisites
- Rust 1.85+ (edition 2024)
- An API key from OpenAI, Anthropic, or Google
### Build from Source
```bash
git clone https://github.com/yourusername/rsadk
cd rsadk
cargo build --release
```
The binary will be at `./target/release/rsadk`.
## Configuration
Set your LLM provider API key as an environment variable:
```bash
# OpenAI
export OPENAI_API_KEY=sk-...
# Anthropic
export ANTHROPIC_API_KEY=sk-ant-...
# Google Gemini
export GOOGLE_API_KEY=...
```
Or create a `.env` file in your working directory:
```env
OPENAI_API_KEY=sk-...
```
## Usage
### Interactive REPL
Start an interactive chat session:
```bash
rsadk run
```
With a specific provider:
```bash
rsadk -p anthropic run
rsadk -p gemini run
```
With a custom system prompt:
```bash
rsadk run -s "You are a helpful coding assistant"
```
### Single Query
Ask a single question and get a response:
```bash
rsadk ask "What is the capital of France?"
rsadk ask "Calculate 15% of 850"
```
### A2A Protocol Server
Start the agent as an A2A-compatible server:
```bash
rsadk serve --port 8080
```
Options:
- `--host`: Host to bind (default: 127.0.0.1)
- `--port`: Port to bind (default: 8080)
- `--name`: Agent name (default: rsadk-agent)
- `-s, --system-prompt`: Custom system prompt
The server exposes:
- `POST /a2a` - JSON-RPC endpoint
- `GET /.well-known/agent.json` - Agent card
- `GET /health` - Health check
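With the server running, the two GET endpoints can be sanity-checked from another terminal (illustrative; the exact response bodies depend on your build and configuration):

```bash
# Assumes `rsadk serve --port 8080` is running locally

# Health check
curl http://localhost:8080/health

# Agent card advertised to other agents
curl http://localhost:8080/.well-known/agent.json
```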
### Show Information
```bash
rsadk info
```
## Built-in Tools
| Tool | Description |
|------|-------------|
| `calculator` | Evaluate mathematical expressions (e.g., `2 + 3 * 4`) |
| `datetime` | Get current date/time with timezone support |
| `http_fetch` | Fetch content from URLs |
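Queries like the following typically cause the model to invoke the corresponding tool, though which tool (if any) is called is ultimately the LLM's decision:

```bash
rsadk ask "What is (14 + 6) * 3?"                        # likely calculator
rsadk ask "What time is it in UTC right now?"            # likely datetime
rsadk ask "Fetch https://example.com and summarize it"   # likely http_fetch
```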
## Library Usage
Use RSADK as a library in your Rust project:
```toml
[dependencies]
rsadk = { path = "../rsadk" }
tokio = { version = "1", features = ["full"] }
```
### Basic Example
```rust
use rsadk::prelude::*;
use std::sync::Arc;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Create an agent with tools
    let agent = LlmAgentBuilder::new("assistant")
        .description("A helpful AI assistant")
        .system_prompt("You are a helpful assistant.")
        .with_builtin_tools()
        .build();

    // Create a runner with OpenAI
    let config = RunnerConfig::openai();
    let runner = Runner::new(Arc::new(agent), config)?;

    // Run the agent
    let events = runner.run("What is 15 * 7?").await?;
    for event in events {
        if event.is_final_response() {
            println!("{}", event.content.as_text().unwrap_or_default());
        }
    }

    Ok(())
}
```
### Custom Tool
```rust
use rsadk::prelude::*;
use std::sync::Arc;

let weather_tool = FunctionToolBuilder::new("get_weather")
    .description("Get the current weather for a city")
    .string_param("city", "The city name", true)
    .build_sync(|args| {
        let city = args["city"].as_str().unwrap_or("Unknown");
        Ok(serde_json::json!({
            "city": city,
            "temperature": "72°F",
            "condition": "Sunny"
        }))
    });

let agent = LlmAgentBuilder::new("weather_agent")
    .system_prompt("You help users check the weather.")
    .tool(Arc::new(weather_tool))
    .build();
```
### A2A Server Integration
```rust
use rsadk::prelude::*;
use std::sync::Arc;

let agent = LlmAgentBuilder::new("my-agent")
    .with_builtin_tools()
    .build();

let agent_card = AgentCard::new("my-agent", "http://localhost:8080")
    .with_description("My custom AI agent");

// `session_service`, `tools`, `llm_client`, and `llm_config` are
// constructed by your application before this point.
let handlers = A2aHandlers::new(
    Arc::new(agent),
    agent_card,
    session_service,
    tools,
    llm_client,
    llm_config,
);

let server = A2aServer::new(handlers, "127.0.0.1:8080".parse()?);
server.serve().await?;
```
## Architecture
```
┌─────────────────────────────────────────────────────────────┐
│                          CLI / API                          │
├─────────────────────────────────────────────────────────────┤
│                           Runner                            │
├─────────────┬─────────────┬─────────────┬───────────────────┤
│    Agent    │   Session   │    Tools    │      Events       │
│ (LlmAgent)  │   (State)   │ (Registry)  │     (Stream)      │
├─────────────┴─────────────┴─────────────┴───────────────────┤
│                        LLM Providers                        │
│                (OpenAI / Anthropic / Gemini)                │
├─────────────────────────────────────────────────────────────┤
│                        A2A Protocol                         │
│                      (JSON-RPC Server)                      │
└─────────────────────────────────────────────────────────────┘
```
## A2A Protocol
RSADK implements the [A2A Protocol](https://a2a-protocol.org/) for agent interoperability:
### Supported Methods
| Method | Description |
|--------|-------------|
| `a2a.getAgentCard` | Get agent metadata and capabilities |
| `a2a.sendMessage` | Send a message to the agent |
| `a2a.getTask` | Get task status and history |
| `a2a.listTasks` | List tasks with filtering |
| `a2a.cancelTask` | Cancel a running task |
### Example Request
```bash
curl -X POST http://localhost:8080/a2a \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "method": "a2a.sendMessage",
    "params": {
      "message": {
        "role": "user",
        "parts": [{"type": "text", "text": "Hello!"}]
      }
    },
    "id": 1
  }'
```
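A successful call returns a standard JSON-RPC 2.0 envelope (`jsonrpc`, `id`, `result`). The contents of `result` below are illustrative only; the actual field names depend on the handler's response types:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "message": {
      "role": "agent",
      "parts": [{"type": "text", "text": "Hello! How can I help you?"}]
    }
  }
}
```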
## Project Structure
```
rsadk/
├── Cargo.toml
├── src/
│   ├── lib.rs              # Public API
│   ├── main.rs             # CLI
│   ├── agent/
│   │   ├── mod.rs          # Agent trait
│   │   ├── context.rs      # Execution context
│   │   └── llm_agent.rs    # LLM-powered agent
│   ├── tool/
│   │   ├── mod.rs          # Tool trait & registry
│   │   ├── function.rs     # Function tool wrapper
│   │   └── builtin.rs      # Built-in tools
│   ├── session/
│   │   ├── mod.rs          # Session types
│   │   └── service.rs      # Session service
│   ├── event/
│   │   ├── mod.rs          # Event types
│   │   └── stream.rs       # Event streaming
│   ├── llm/
│   │   ├── mod.rs          # LLM client trait
│   │   ├── openai.rs       # OpenAI provider
│   │   ├── anthropic.rs    # Anthropic provider
│   │   └── gemini.rs       # Gemini provider
│   ├── a2a/
│   │   ├── mod.rs          # A2A exports
│   │   ├── types.rs        # Protocol types
│   │   ├── handlers.rs     # Request handlers
│   │   └── server.rs       # HTTP server
│   └── runner/
│       ├── mod.rs          # Runner
│       └── cli.rs          # CLI runner
```
## License
MIT