
AI Integration

argsh scripts are AI-native by design. The same :usage and :args declarations that power help text and shell completion also generate machine-readable tool definitions for LLMs. No glue code, no manual schema maintenance.

Two modes of operation

| Mode | Command | What it does |
| --- | --- | --- |
| Static schemas | docgen llm claude | JSON tool arrays you feed into API calls |
| Live MCP server | mcp | Stdio JSON-RPC server that AI agents connect to directly |

Both derive their tool definitions from the same source — your script's :usage and :args declarations are the single source of truth.
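
As a quick sanity check that the schema tracks your declarations, you can list the tool names argsh derives from them. This sketch assumes the Claude output format, i.e. a JSON array of tool objects with name, description, and input_schema fields per Anthropic's tool schema; other providers use slightly different shapes, and jq must be installed.

```bash
# List the tool names derived from the script's :usage/:args declarations.
# Assumes Claude-format output: a JSON array of {name, description, input_schema}.
./myapp docgen llm claude | jq -r '.[].name'
```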

Static schemas (docgen llm)

Generate ready-to-use tool definitions for Anthropic, OpenAI, Gemini, or Kimi. Pipe the output into your agent framework or commit it to your repo for CI.

```bash
./myapp docgen llm claude > tools.json
```
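
As a rough sketch of what "feed into API calls" looks like, the snippet below wraps the generated tools.json in an Anthropic Messages API request using jq and curl. The request fields and headers are standard Anthropic API usage rather than anything argsh provides, the model name is only illustrative, and an ANTHROPIC_API_KEY is assumed in the environment.

```bash
# Sketch: hand the generated tool array straight to the Anthropic Messages API.
# Assumes jq, curl, and ANTHROPIC_API_KEY; the model name is only an example.
jq -n --slurpfile tools tools.json '{
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  tools: $tools[0],
  messages: [{role: "user", content: "Which commands does myapp expose?"}]
}' | curl -s https://api.anthropic.com/v1/messages \
      -H "x-api-key: ${ANTHROPIC_API_KEY}" \
      -H "anthropic-version: 2023-06-01" \
      -H "content-type: application/json" \
      -d @-
```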

Live MCP server (mcp)

Turn your script into a Model Context Protocol server. Claude Code, Cursor, Claude Desktop, and other MCP clients can discover and invoke your commands directly.

```bash
./myapp mcp
```

No Python, no Node.js wrappers — one command and your CLI is a live tool server.
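
For example, registering it with Claude Code takes a single command (the server name here is arbitrary, and the Claude Code CLI is assumed to be installed); other MCP clients typically take the same command-plus-arguments pair in their configuration.

```bash
# Register the script as a stdio MCP server in Claude Code.
# "myapp" is an arbitrary server name; an absolute path to the script is safest.
claude mcp add myapp -- "$(pwd)/myapp" mcp
```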
