OpenAI Codex CLI: A Hands-On Guide for Deployment Workflows



This is Part 2 of our series on AI coding assistants for developers. See also: Getting Started with Claude Code, Getting Started with Google Gemini CLI, and Comparing Claude Code, Codex, and Gemini CLI.


OpenAI's Codex CLI is a terminal-based coding agent that reads, modifies, and executes code directly on your machine. Built in Rust and fully open source, it brings OpenAI's latest reasoning models into your existing development workflow — no browser required.

Unlike the deprecated Codex API, this tool is designed for agentic coding: you describe what you want in plain English, and Codex figures out which files to change, what commands to run, and how to verify the result. Your source code stays local unless you explicitly share it — only prompts and high-level context are sent to the model.

For teams managing deployment pipelines, CI/CD configurations, and release workflows, Codex CLI offers a practical way to automate repetitive tasks while keeping full control over what gets changed.

What Is Codex CLI?

Codex CLI is an open-source command-line coding agent from OpenAI. It launches a full-screen terminal interface where you can have a conversation about your codebase, ask it to implement features, fix bugs, review code, or explain unfamiliar logic.

Here's what sets it apart from browser-based AI tools:

  • Local-first execution: Codex runs on your machine, reading and modifying files in your working directory. Your source code never leaves your environment.
  • Agentic workflow: Rather than just generating snippets, Codex plans multi-step changes across multiple files, runs commands, and verifies results.
  • Granular permissions: Three approval modes let you control exactly how much autonomy Codex has — from read-only consultation to full access.
  • Session persistence: Conversations are stored locally. You can resume previous sessions with codex resume --last or codex resume <SESSION_ID>, picking up exactly where you left off.
  • Built-in code review: The /review command analyses diffs against base branches, uncommitted changes, or specific commits — useful before deploying.

Codex CLI is powered by gpt-5.3-codex, a model specifically optimised for software engineering tasks. ChatGPT Pro subscribers also get access to GPT-5.3-Codex-Spark, a faster variant for simpler tasks.

How Does It Compare to Other AI CLI Tools?

If you're evaluating terminal-based coding assistants, here's a quick comparison:

| Feature | Codex CLI | Claude Code | Gemini CLI |
| --- | --- | --- | --- |
| Default model | GPT-5.3-Codex | Claude Sonnet | Gemini 2.5 Pro |
| Auth | ChatGPT plan or API key | Anthropic API key | Google account |
| Approval modes | Auto / Read-only / Full Access | Auto-accept or confirm | Sandbox levels |
| MCP support | Yes (STDIO + HTTP) | Yes | Yes |
| Multi-agent | Yes (experimental) | Via Task tool | No |
| Code review | Built-in /review | Manual workflow | No |
| Session resume | Yes | Yes | No |
| Open source | Yes (Rust) | No | Yes (TypeScript) |

For a detailed breakdown, see our full comparison of Claude Code, Codex CLI, and Gemini CLI.

Installation and Setup

Installing with npm

npm install -g @openai/codex

To upgrade to the latest version:

npm install -g @openai/codex@latest

System Requirements

  • macOS and Linux: Fully supported
  • Windows: Experimental. We recommend running Codex through WSL (Windows Subsystem for Linux) for the best experience — see the next section.

Setting Up Codex CLI on Windows with WSL

Windows users can run Codex CLI reliably through WSL. Here's how:

  1. Install WSL if you haven't already:
wsl --install
  2. Open your WSL terminal (Ubuntu by default) and install Node.js:
curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash -
sudo apt-get install -y nodejs
  3. Install Codex CLI:
npm install -g @openai/codex
  4. Configure authentication (see below) and run codex from within your WSL workspace. Your Windows files are accessible under /mnt/c/, but for best performance work within the WSL filesystem (e.g., ~/projects/).

Authentication

You have two options:

ChatGPT Account (Recommended): If you have a ChatGPT Plus, Pro, Business, Edu, or Enterprise plan, Codex CLI is included at no extra cost. Run codex and follow the authentication prompts.

API Key: For headless automation or CI/CD environments:

export OPENAI_API_KEY="your-api-key-here"

The key itself is read from your shell environment, so add the export line to your shell profile (e.g., ~/.bashrc) to persist it. To make Codex prefer API-key authentication over ChatGPT login, add this to your ~/.codex/config.toml:

preferred_auth_method = "apikey"

Getting Started

Navigate to your project directory and run:

codex

This starts an interactive session with a full-screen terminal UI. You can also pass a prompt directly for non-interactive use:

codex "Explain the structure of this project"

Approval Modes

Approval modes control how much Codex can do without asking for confirmation. You can switch modes during a session using the /permissions command.

Auto (default): Codex can read files, edit code, and run commands within your working directory. It asks permission before touching anything outside that scope or using the network. This is the right choice for most development work.

Read-only: Codex can browse your files but won't make changes or run commands until you approve. Use this when you want Codex as a consultant — for code review, architecture questions, or understanding unfamiliar code.

Full Access: Codex has unrestricted access to your machine, including network operations, without asking. Use this sparingly and only for trusted, well-defined tasks in environments you control. It's powerful for automation workflows where you've already validated what Codex will do.

Choosing Models

By default, Codex uses gpt-5.3-codex for the best balance of capability and speed. You can switch models at any time:

# Use the default GPT-5.3-Codex
codex

# Use a specific model
codex -m gpt-5

# Switch models mid-session with the /model command

ChatGPT Pro subscribers get access to GPT-5.3-Codex-Spark, a faster variant that uses less of your subscription quota — ideal for simpler tasks or when you're approaching usage limits.

Practical Examples for Deployment Workflows

Creating CI/CD Pipeline Configurations

Codex excels at generating configuration files because it can read your project structure and tailor the output:

codex "Create a GitHub Actions workflow that runs tests on push, 
       builds the application, and deploys to production on merge to main"
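
For a Node.js project, the result might look something like the workflow below (a sketch of plausible output, not Codex's guaranteed response — the job names, Node version, and the `./scripts/deploy.sh` deploy step are assumptions you'd replace with your own):

```yaml
name: CI

on:
  push:
    branches: ["**"]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 22
      - run: npm ci
      - run: npm test

  build-and-deploy:
    # Deploy only on merges to main, and only after tests pass
    if: github.ref == 'refs/heads/main'
    needs: test
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci && npm run build
      - name: Deploy to production
        run: ./scripts/deploy.sh  # placeholder for your actual deploy step
```

Because Codex reads your project structure first, it will usually pick up your actual test runner and build commands rather than generic defaults.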

Pre-Deployment Code Review

Use the built-in /review command before deploying:

codex
> /review main

This analyses the diff between your current branch and main, flagging potential issues with database migrations, API changes, or deployment-breaking modifications. You can also review uncommitted changes:

codex
> /review --uncommitted

Automating Changelog Updates

Integrate Codex into your release process with the exec command for non-interactive automation:

jobs:
  update_changelog:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Update changelog via Codex
        run: |
          npm install -g @openai/codex
          export OPENAI_API_KEY="${{ secrets.OPENAI_API_KEY }}"
          codex exec "Update CHANGELOG for next release based on commits since last tag"

From Screenshot to Code

Codex supports image inputs — pass screenshots, wireframes, or diagrams alongside text prompts:

codex "Build a deployment status dashboard that looks like this screenshot" -i ./dashboard-mockup.png

This is particularly useful for creating deployment monitoring interfaces or recreating designs from visual specifications.

Git Integration for Release Management

Pipe Git output directly to Codex:

# Generate release notes from recent commits
git log --oneline v1.0.0..HEAD | codex exec "Create detailed release notes from these commits"

# Analyse changes between branches
git diff main..feature/deployment-updates | codex exec "Summarise these changes and highlight any deployment-related modifications"

Writing Deployment Scripts

codex "Write a deployment script that:
       - Puts the app in maintenance mode
       - Pulls the latest code
       - Runs bundle install
       - Runs database migrations
       - Precompiles assets
       - Restarts the application server
       - Takes the app out of maintenance mode
       - Includes error handling and rollback capabilities"
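
For comparison, here is a minimal sketch of the kind of script that prompt might produce. The maintenance/restart file paths (tmp/maintenance.txt, tmp/restart.txt) assume a Rails app behind a Passenger-style server and are illustrative assumptions; the script defaults to a dry run that only prints each step:

```shell
#!/usr/bin/env bash
# Sketch of a deployment script with error handling and rollback.
set -euo pipefail

DRY_RUN="${DRY_RUN:-1}"  # defaults to dry-run; set DRY_RUN=0 to execute for real

run() {
  # In dry-run mode, print the command instead of executing it
  if [ "$DRY_RUN" = "1" ]; then
    echo "[dry-run] $*"
  else
    "$@"
  fi
}

rollback() {
  echo "Deploy failed: rolling back" >&2
  run git reset --hard "$PREVIOUS_SHA"
  run touch tmp/restart.txt      # restart on the previous code
  run rm -f tmp/maintenance.txt  # make sure maintenance mode is lifted
}

if [ "$DRY_RUN" = "1" ]; then
  PREVIOUS_SHA="PREVIOUS_SHA"    # placeholder in dry-run mode
else
  PREVIOUS_SHA="$(git rev-parse HEAD)"
fi
trap rollback ERR

run touch tmp/maintenance.txt            # 1. enter maintenance mode
run git pull --ff-only origin main       # 2. pull the latest code
run bundle install --deployment          # 3. install gems
run bundle exec rails db:migrate         # 4. run database migrations
run bundle exec rails assets:precompile  # 5. precompile assets
run touch tmp/restart.txt                # 6. restart the application server
run rm -f tmp/maintenance.txt            # 7. leave maintenance mode

trap - ERR
echo "Deploy complete"
```

Running it with DRY_RUN=1 prints the full sequence so you can review the plan before letting it touch a server — the same review-before-execute habit the approval modes encourage.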

Extending Codex with MCP

Codex supports the Model Context Protocol (MCP) for connecting to external tools and services. It supports both STDIO and HTTP transports.

Add MCP server configurations to your ~/.codex/config.toml:

[mcp_servers.github]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-github"]

[mcp_servers.filesystem]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]

This enables interactions like:

> @github List open pull requests that are ready for deployment
> @github Create a PR from this branch to main with a deployment checklist

Multi-Agent Workflows

One of Codex CLI's most powerful capabilities is experimental multi-agent support. You can run multiple Codex agents in parallel on the same repository, each working in isolated Git worktrees to avoid conflicts.

This is configured in config.toml with role assignments, and it integrates with the OpenAI Agents SDK for:

  • Parallel task execution: Multiple agents working on different parts of a codebase simultaneously
  • Orchestration: Running Codex as an MCP server that other tools can call
  • Auditable handoffs: Full traces of what each agent did, making it easy to review and approve changes
  • Scalability: From single-agent quick fixes to coordinated team-level efforts

This is still experimental, but it's a glimpse of where AI-assisted development is heading — especially for large-scale refactoring or migration projects.

Custom Instructions with AGENTS.md

Give Codex project-specific context using AGENTS.md files. Codex looks for these in multiple locations and merges them:

  • ~/.codex/AGENTS.md — Your personal preferences
  • Root of your repository — Shared project conventions
  • Current working directory — Feature-specific instructions

Example for a deployment-focused project:

## Deployment Context
- Ruby on Rails application deployed via DeployHQ
- Production server runs Ubuntu 22.04 with Nginx and Puma
- Database is PostgreSQL

## Deployment Rules
- Always run migrations before deploying new features
- Never deploy directly to production without staging validation
- Include rollback commands in deployment scripts

Codex Cloud and IDE Integration

Beyond the terminal, Codex offers two additional interfaces:

Codex Cloud (via ChatGPT): Assign tasks from the web or mobile app, and Codex works on them asynchronously in a sandboxed environment. Start a task on your phone during your commute and review the results later. The cloud and CLI experiences are integrated — you can continue work across both.

VS Code Extension: A VS Code extension (compatible with Cursor and Windsurf) brings the same capabilities into your editor with context from open files and selections.

Pricing

Codex CLI is included with ChatGPT Plus, Pro, Business, Edu, and Enterprise plans at no additional cost. If you're using API credits directly, pricing follows standard OpenAI API rates.

For teams already subscribed to ChatGPT, this makes Codex CLI one of the most accessible AI coding tools available — no separate billing to manage.


Want to streamline your deployment workflow? DeployHQ automates deploying code from Git to your servers, so you can focus on building. Get started free.

Have questions? Reach out at support@deployhq.com or find us on X/Twitter.