Last updated on 16th April 2026

AI Assistant and Agent Integration

The DeployHQ CLI includes a local AI assistant for deployment help and integrations for AI coding agents like Claude Code and OpenAI Codex.

AI Assistant (dhq assist)

Get AI-powered help for your deployments using a local LLM. All data stays on your machine -- nothing is sent to external AI services.

If you are already using an AI coding agent like Claude Code, Codex, or Cursor, your agent can interact with DeployHQ directly through the CLI and API -- you do not need dhq assist. The local assistant is best suited for developers who want a privacy-first, offline-capable option without relying on an external coding agent. It uses an open-source model via Ollama and keeps all data on your machine.

Setup

The assistant requires Ollama running locally. Run the one-time setup:

dhq assist --setup

This installs Ollama (if needed) and downloads the default model (qwen2.5:3b, approximately 2GB).

Asking Questions

# Ask about a deployment
dhq assist "why did my deploy fail?" -p my-app

# Get suggestions
dhq assist "what should I do?" -p my-app

# Ask about DeployHQ concepts
dhq assist "what does transfer_files do?"

The assistant automatically gathers context from your project (recent deployments, server configuration, logs) to provide relevant answers.

Interactive Mode

For a conversational experience:

dhq assist --interactive -p my-app

Checking Status

Verify that Ollama is running and the model is available:

dhq assist --status

Using a Different Model

dhq assist "why did my deploy fail?" --model llama3.2

Agent Plugins

The CLI can install integration plugins that help AI coding agents interact with DeployHQ. Plugins are available for Claude Code, OpenAI Codex, Cursor, and Windsurf.

Claude Code Integration

dhq setup claude

This creates a .claude/SKILL.md file and a command reference in your project, enabling Claude Code to discover and use DeployHQ CLI commands.

To install at the project level only:

dhq setup claude --project

To remove the integration:

dhq setup claude --uninstall

OpenAI Codex Integration

dhq setup codex

This creates a .codex/ directory with DeployHQ configuration for Codex.

To remove the integration:

dhq setup codex --uninstall

Cursor Integration

dhq setup cursor

This creates a .cursor/SKILL.md file with DeployHQ configuration for Cursor.

To remove the integration:

dhq setup cursor --uninstall

Windsurf Integration

dhq setup windsurf

This creates a .windsurf/SKILL.md file with DeployHQ configuration for Windsurf.

To remove the integration:

dhq setup windsurf --uninstall

Command Catalog

The CLI exposes a full command catalog as JSON, designed for AI agent discovery:

dhq commands --json

This outputs every command, subcommand, flag, and description in a structured format that agents can parse to understand available operations.
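As a sketch of how an agent might consume the catalog: the snippet below parses a small, hypothetical sample of the JSON output. The field names (name, description, flags) are assumptions for illustration; the real schema may differ.

```python
import json

# Hypothetical sample of `dhq commands --json` output. The field names
# used here are assumptions, not the CLI's documented schema.
catalog_json = """
[
  {"name": "deploy", "description": "Start a deployment",
   "flags": [{"name": "--project", "short": "-p", "description": "Project ID"}]},
  {"name": "servers", "description": "List servers", "flags": []}
]
"""

catalog = json.loads(catalog_json)
for cmd in catalog:
    # Summarise each command with its flags (or "none" when there are no flags)
    flags = ", ".join(f["name"] for f in cmd["flags"]) or "none"
    print(f"{cmd['name']}: {cmd['description']} (flags: {flags})")
```

An agent would run `dhq commands --json`, parse the result the same way, and use the structured entries to decide which operations are available.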

Agent-Optimized Help

dhq --help --agent

This outputs help in JSON format optimized for machine consumption.

Agent Identification

When running the CLI from an automation agent, set the DEPLOYHQ_AGENT environment variable to identify the agent:

DEPLOYHQ_AGENT=my-bot dhq deploy -p my-app --json

The CLI also automatically detects common agent environments:

- DEPLOYHQ_AGENT environment variable
- CLAUDE_CODE or CLAUDECODE environment variable (Claude Code)
- CURSOR_AGENT environment variable (Cursor)
- WINDSURF_AGENT environment variable (Windsurf)
- Standard CI environment variables (GitHub Actions, GitLab CI, etc.)
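The detection described above can be sketched as a lookup over those environment variables. This is an illustrative approximation, not the CLI's actual implementation; the precedence order and the labels returned are assumptions.

```python
import os

# Environment variables the documentation says are checked. The order and
# the label strings below are assumptions for illustration.
AGENT_VARS = [
    ("DEPLOYHQ_AGENT", None),        # explicit identification wins; value is the name
    ("CLAUDE_CODE", "claude-code"),
    ("CLAUDECODE", "claude-code"),
    ("CURSOR_AGENT", "cursor"),
    ("WINDSURF_AGENT", "windsurf"),
    ("GITHUB_ACTIONS", "github-actions"),
    ("GITLAB_CI", "gitlab-ci"),
]

def detect_agent(env=os.environ):
    """Return an agent identifier if one of the known variables is set."""
    for var, label in AGENT_VARS:
        value = env.get(var)
        if value:
            # DEPLOYHQ_AGENT carries the agent name itself; the others map
            # to a fixed label for the detected environment.
            return value if label is None else label
    return None
```

For example, `detect_agent({"DEPLOYHQ_AGENT": "my-bot"})` returns "my-bot", while an empty environment returns None.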

MCP Server

The CLI includes a built-in MCP (Model Context Protocol) server that enables AI assistants like Claude Desktop to interact with DeployHQ:

dhq mcp

This starts the MCP server in stdio mode. For detailed setup and usage instructions, see the MCP Server documentation.
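For Claude Desktop specifically, a stdio MCP server is typically registered in the mcpServers section of its configuration file. The entry below is a minimal sketch; the server name "deployhq" is just a label of your choosing, and you should confirm the details against the MCP Server documentation.

```json
{
  "mcpServers": {
    "deployhq": {
      "command": "dhq",
      "args": ["mcp"]
    }
  }
}
```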

Showing Resources by URL

Resolve a DeployHQ URL to structured data:

dhq show https://mycompany.deployhq.com/projects/my-app

This is useful for AI agents that encounter DeployHQ URLs in conversations or documents.
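As an illustration, an agent might first extract DeployHQ URLs from free-form text before passing each one to dhq show. The account-subdomain pattern below is an assumption inferred from the example URL above, and extract_deployhq_urls is a hypothetical helper, not part of the CLI.

```python
import re

# Matches https://<account>.deployhq.com/<path>. The subdomain pattern is
# an assumption based on the example URL in this document.
DEPLOYHQ_URL = re.compile(r"https://[\w-]+\.deployhq\.com/\S+")

def extract_deployhq_urls(text):
    """Return DeployHQ URLs found in free-form text, e.g. to feed `dhq show`."""
    return DEPLOYHQ_URL.findall(text)

urls = extract_deployhq_urls(
    "Deploy failed, see https://mycompany.deployhq.com/projects/my-app for logs."
)
print(urls)
```

Each extracted URL could then be resolved with a call such as dhq show <url> --json (the --json flag here is an assumption by analogy with the other commands shown above).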