# Cortex Quickstart
Cortex is a Rust AI agent framework. It gives you type-safe tool macros, multi-agent crews, and a built-in MCP server that lets Meridian call your Rust logic from the edge.
## Add dependencies
```toml
# Cargo.toml
[dependencies]
cortexai-agents = "0.1"
cortexai-tools = "0.1"
cortexai-providers = "0.1"
anyhow = "1"
tokio = { version = "1", features = ["full"] }
```

(`anyhow` is listed here because the examples below return `anyhow::Result`.) Or with `cargo add`:

```sh
cargo add cortexai-agents cortexai-tools cortexai-providers anyhow tokio
```

## Your first agent with a tool
```rust
use cortexai_agents::{AgentEngine, AgentConfig};
use cortexai_providers::anthropic::AnthropicProvider;
use cortexai_tools::tool;

#[tool(description = "Multiply two numbers together")]
async fn multiply(a: f64, b: f64) -> f64 {
    a * b
}
```
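When the model decides to call `multiply`, it does not do the arithmetic itself: the provider returns a tool-call request and the engine runs your function, then sends the result back. Roughly, the wire exchange uses the Anthropic Messages API content-block shapes (the `id` below is illustrative, and exactly how Cortex serializes the result is an assumption):

```json
{ "type": "tool_use", "id": "toolu_01", "name": "multiply", "input": { "a": 17, "b": 43 } }
```

followed by the engine's reply:

```json
{ "type": "tool_result", "tool_use_id": "toolu_01", "content": "731" }
```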
```rust
#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let provider = AnthropicProvider::from_env()?;

    let mut agent = AgentEngine::new(AgentConfig {
        model: "claude-3-5-haiku-20241022".into(),
        system_prompt: "You are a math assistant.".into(),
        ..Default::default()
    })
    .with_provider(provider)
    .with_tool(multiply);

    let response = agent.chat("What is 17 times 43?").await?;
    println!("{}", response);
    Ok(())
}
```

## Run it
```sh
export ANTHROPIC_API_KEY=sk-ant-...
cargo run
```

Output:
```text
17 times 43 is 731. I used the multiply tool to calculate this precisely.
```

## Expose as an MCP server
Wrap your agent in an `McpServer` to make it available to Meridian over the Model Context Protocol:
```rust
use cortexai_agents::mcp::McpServer;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let agent = /* ... same setup as above ... */;

    McpServer::new(agent)
        .bind("0.0.0.0:3100")
        .serve()
        .await?;

    Ok(())
}
```

Then in your Meridian agent:
```
tools: [
  MeridianAgent.mcpTool("http://localhost:3100"),
]
```

Meridian will automatically discover all tools your Cortex agent exposes.
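Discovery is a standard MCP `tools/list` call over JSON-RPC. A sketch of the exchange (the envelope follows the MCP specification; the exact schema fields Cortex derives from `multiply`'s signature are an assumption):

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```

with a response along the lines of:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "multiply",
        "description": "Multiply two numbers together",
        "inputSchema": {
          "type": "object",
          "properties": {
            "a": { "type": "number" },
            "b": { "type": "number" }
          },
          "required": ["a", "b"]
        }
      }
    ]
  }
}
```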
## Next Steps
- Add more tools — file I/O, HTTP calls, database queries
- Build a crew — run specialized agents in parallel
- Graph workflows — multi-step stateful pipelines
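Adding more tools is just a matter of writing more `#[tool]` functions; conceptually, the engine then routes each model tool call to the matching function by name. A std-only sketch of that dispatch, illustrating the idea rather than Cortex's actual internals:

```rust
use std::collections::HashMap;

// A tool here is just a boxed function; a real engine would accept JSON
// arguments and return a serialized result.
type Tool = Box<dyn Fn(f64, f64) -> f64>;

// Build a name -> function registry, as .with_tool(...) conceptually does.
fn registry() -> HashMap<&'static str, Tool> {
    let mut tools: HashMap<&'static str, Tool> = HashMap::new();
    tools.insert("multiply", Box::new(|a, b| a * b));
    tools.insert("add", Box::new(|a, b| a + b));
    tools
}

// Route a model's tool call to the registered function, if any.
fn call_tool(name: &str, a: f64, b: f64) -> Option<f64> {
    registry().get(name).map(|f| f(a, b))
}

fn main() {
    println!("{:?}", call_tool("multiply", 17.0, 43.0)); // Some(731.0)
}
```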