a2

An open-source SDK for building AI agents.

a2 handles sandboxed execution, persistent storage, streaming, and tool orchestration for AI agents. Durability via Vercel Workflow is opt-in. Built on the Vercel AI SDK.

src/agent.ts
import { agent } from "experimental-agent";

export const myAgent = agent("my-agent", {
  model: "anthropic/claude-opus-4.6",
  system: "You are a helpful coding assistant.",
  needsApproval: {
    Bash: true,
  },
});

Send and Stream

src/chat.ts
import { createUIMessageStreamResponse } from "ai";
import { myAgent } from "@/agent";

export async function sendMessage({
  chatId,
  message,
}: {
  chatId: string;
  message: string;
}) {
  const session = myAgent.session(chatId);
  await session.send(message);
  const stream = await session.stream();
  return createUIMessageStreamResponse({ stream });
}

a2 works with any JavaScript runtime. See Framework Guides for runtime-specific setup, including Next.js.
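In Next.js, for example, the sendMessage helper above can back a route handler directly, since it already returns a streaming Response. A minimal sketch (the route path and request body shape are illustrative, not prescribed by a2):

```typescript
// app/api/chat/route.ts (illustrative path)
import { sendMessage } from "@/chat";

export async function POST(req: Request) {
  // Assumed request shape: { chatId: string, message: string }.
  const { chatId, message } = await req.json();
  // sendMessage returns the Response built by createUIMessageStreamResponse,
  // so it can be returned from the handler as-is.
  return sendMessage({ chatId, message });
}
```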

Built-in Tools

Every agent includes 8 tools that execute inside the sandbox:

  • Read — Read file contents with line-range pagination
  • Write — Write content to a file, creating parent directories
  • Edit — Replace an exact string in a file
  • Grep — Search files with ripgrep (regex, globs, file type filters)
  • List — Recursively list directory contents with depth control
  • Bash — Execute shell commands with streaming output
  • Skill — Load reusable instruction sets on demand
  • JavaScript — Execute JavaScript code dynamically, with access to other tools as sub-tools

Define additional tools using tool() from the Vercel AI SDK. Use activeTools to control which tools are enabled per-agent or per-session, and needsApproval to require human approval before any tool executes — both built-in and custom.
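As a sketch of what a custom tool can look like: tool() and the zod schema below follow the Vercel AI SDK (inputSchema is the AI SDK 5 key; earlier versions used parameters), while the getWeather tool itself and the tools key on agent() are invented for illustration:

```typescript
import { agent } from "experimental-agent";
import { tool } from "ai";
import { z } from "zod";

// A custom tool defined with the AI SDK's tool() helper.
// The name, schema, and return value are illustrative.
const getWeather = tool({
  description: "Look up the current weather for a city.",
  inputSchema: z.object({ city: z.string() }),
  execute: async ({ city }) => {
    // Call your own weather backend here.
    return { city, tempC: 21 };
  },
});

export const weatherAgent = agent("weather-agent", {
  model: "anthropic/claude-opus-4.6",
  system: "You answer weather questions.",
  tools: { getWeather },               // assumed key for registering custom tools
  activeTools: ["Read", "getWeather"], // restrict which tools are enabled
  needsApproval: { getWeather: true }, // approvals apply to custom tools too
});
```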

What a2 Handles

  • Sandbox — Agents run tools in an isolated execution environment. Supports Vercel Sandbox, local filesystem, Docker, or a custom backend.
  • Storage — Sessions, messages, and sandbox state are persisted via a storage() function you provide. Built-in localStorage() for dev, or bring your own database.
  • Streaming — Real-time message streaming with reconnection support. Compatible with the AI SDK's useChat and createUIMessageStreamResponse.
  • Workflow (Opt-In) — Add "use workflow" for durable execution via Vercel Workflows. Survives serverless timeouts, can suspend for human approval, and resumes where it left off.
  • Approvals — Require human approval before specific tools execute. The agent suspends until the user responds.
  • Hooks — Intercept tool execution with tool.before and tool.after hooks to modify inputs, transform outputs, or block calls.
  • Skills — Reusable instruction sets loaded from the filesystem that the agent reads on demand.
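The hooks above might look like this in practice. The registration shape below is an assumption built around the documented tool.before and tool.after names, and both policies are illustrative:

```typescript
import { agent } from "experimental-agent";

export const auditedAgent = agent("audited-agent", {
  model: "anthropic/claude-opus-4.6",
  system: "You are a helpful coding assistant.",
  // Hook registration shape is assumed; a2 documents tool.before /
  // tool.after as points to modify inputs, transform outputs, or block calls.
  hooks: {
    tool: {
      before: async ({ name, input }) => {
        // Block writes outside the workspace (illustrative policy).
        if (name === "Write" && !String(input.path).startsWith("/workspace")) {
          throw new Error("Writes are restricted to /workspace");
        }
        return { input };
      },
      after: async ({ name, output }) => {
        // Truncate very long Bash output before it reaches the model.
        if (name === "Bash" && typeof output === "string") {
          return { output: output.slice(0, 10_000) };
        }
        return { output };
      },
    },
  },
});
```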

Defaults

| Feature | Default | Override |
| --- | --- | --- |
| Storage | localStorage() (filesystem) | Custom storage() function |
| Sandbox | Vercel Sandbox | sandbox: { type: "local" } or { type: "docker" } |
| Model | — (required) | Any model via Vercel AI Gateway |
| Tools | 8 built-in tools | Add custom tools, restrict with activeTools |
| Approvals | None required | needsApproval: { Bash: true } |
| Workflow | Off (in-process) | Add a "use workflow" function |
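Putting the overrides together, a fully local development setup might look like the following sketch (only the sandbox and needsApproval options are taken from the table above; everything else mirrors the earlier example):

```typescript
import { agent } from "experimental-agent";

export const localAgent = agent("local-agent", {
  model: "anthropic/claude-opus-4.6",
  system: "You are a helpful coding assistant.",
  sandbox: { type: "local" },    // run tools on the local filesystem
                                 // instead of Vercel Sandbox
  needsApproval: { Bash: true }, // require approval before shell commands
});
```

Storage falls back to the built-in localStorage() unless you pass a custom storage() function, and execution stays in-process unless a "use workflow" function opts into durable execution.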

Resources

  • Installation — Install the SDK and its dependencies.
  • Quickstart — Set up an agent step by step.
  • Build Your First Agent — End-to-end guide with sandbox, tools, approvals, and deployment.
  • Concepts — How sessions, tools, sandbox, storage, and streaming work.
  • Changelog — What changed in the latest release.
  • Use with AI SDK — useChat, streaming, typed messages, and model selection.
  • API Reference — Full agent() and session API documentation.
  • GitHub — Source code.
