This is Part 2 of our series on AI coding assistants for developers. See also: Getting Started with Claude Code, Getting Started with Google Gemini CLI, and Comparing Claude Code, Codex, and Gemini CLI.
OpenAI has long been associated with AI-powered code generation, and their latest offering—Codex CLI—brings that power directly to your terminal. Unlike the original Codex API (which has been deprecated), this new command-line tool is designed specifically for agentic coding workflows, allowing developers to read, modify, and execute code locally while leveraging OpenAI's latest reasoning models.
For developers working with deployment automation, Codex CLI offers a compelling combination of powerful AI capabilities and seamless integration with existing workflows. Let's explore how to set it up and put it to work.
What is Codex CLI?
Codex CLI is an open-source command-line coding agent from OpenAI. It runs locally on your machine, understanding your codebase and helping you build features, fix bugs, and understand unfamiliar code—all through natural language commands.
A key distinction from browser-based tools: your source code never leaves your environment unless you explicitly share it. Only your prompts, high-level context, and optional diff summaries are sent to the model for generation.
Codex CLI is powered by GPT-5-Codex, a model specifically optimised for software engineering tasks. It's equally proficient at quick, interactive sessions and at independently working through complex, multi-file tasks.
Installation and Setup
Codex CLI installation is straightforward with npm or Homebrew.
Installing with npm
npm install -g @openai/codex
Installing with Homebrew (macOS)
brew install --cask codex
Upgrading
To update to the latest version, reinstall via your package manager:
npm install -g @openai/codex@latest
Or, if you installed with Homebrew:
brew upgrade --cask codex
System Requirements
Codex CLI officially supports macOS and Linux. Windows support is experimental—if you're on Windows, we recommend running it through WSL (Windows Subsystem for Linux).
Authentication
You have two authentication options:
ChatGPT Account (Recommended): If you have a ChatGPT Plus, Pro, Business, Edu, or Enterprise plan, Codex CLI is included. Simply run codex and follow the authentication prompts.
API Key: For headless automation or if you prefer using API credits:
export OPENAI_API_KEY="your-api-key-here"
To configure API key authentication permanently, add this to your ~/.codex/config.toml:
preferred_auth_method = "apikey"
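If you script your environment setup, the same setting can be written from the shell. A minimal sketch (the CODEX_HOME variable here is just a convenience for this sketch, not an official Codex setting):

```shell
# Write the API-key auth preference into Codex's config file.
# CODEX_HOME is an illustrative override; Codex itself reads ~/.codex/config.toml.
CODEX_HOME="${CODEX_HOME:-$HOME/.codex}"
mkdir -p "$CODEX_HOME"

# Append the setting only if it is not already present.
if ! grep -q '^preferred_auth_method' "$CODEX_HOME/config.toml" 2>/dev/null; then
  echo 'preferred_auth_method = "apikey"' >> "$CODEX_HOME/config.toml"
fi

# Confirm the setting took effect.
grep '^preferred_auth_method' "$CODEX_HOME/config.toml"
```

Pair this with `export OPENAI_API_KEY=...` in your shell profile or CI secrets store.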
Getting Started
Navigate to your project directory and run:
codex
This starts an interactive session. You can also pass a prompt directly:
codex "Explain the structure of this project"
Approval Modes
Codex CLI offers three distinct approval modes that let you control how hands-on you want to be:
- Suggest Mode (--suggest): Codex shows proposed changes but requires your explicit approval before making any modifications. Best for critical code or when you're learning the tool.
- Auto-Edit Mode (--auto-edit): Codex can automatically edit files in your workspace but still asks permission for commands outside your project directory.
- Full Auto Mode (--full-auto): Codex has full access to read files anywhere and run commands with network access. Use this for trusted, well-defined tasks.
You can switch modes during a session using the /mode slash command.
Choosing Models
By default, Codex uses GPT-5-Codex for the best balance of capability and speed. You can specify different models:
# Use the default GPT-5-Codex
codex
# Use GPT-5-Codex-Mini for faster, more cost-effective responses
codex --model gpt-5-codex-mini
# Use a specific model
codex -m gpt-5
The mini version provides approximately 4x more usage within your subscription limits, making it ideal for simpler tasks or when you're approaching usage limits.
Practical Examples for Deployment Workflows
Let's explore how Codex CLI can streamline deployment-related tasks.
Creating CI/CD Pipeline Configurations
Codex CLI excels at generating configuration files:
codex "Create a GitHub Actions workflow that runs tests on push,
builds the application, and deploys to production on merge to main"
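Codex's exact output varies from run to run, but a workflow generated from that prompt tends to have roughly this shape (the file name, Node toolchain, and deploy script below are illustrative placeholders, not Codex's literal output):

```yaml
# .github/workflows/deploy.yml -- illustrative sketch of the generated workflow
name: CI/CD
on:
  push:
    branches: ["**"]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci
      - run: npm test

  build_and_deploy:
    # "Merge to main" surfaces as a push to main.
    if: github.ref == 'refs/heads/main'
    needs: test
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci && npm run build
      - name: Deploy to production
        run: ./scripts/deploy.sh   # placeholder for your actual deploy step
```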
Automating Changelog Updates
Integrate Codex into your release process. Here's an example GitHub Actions job:
jobs:
  update_changelog:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Update changelog via Codex
        run: |
          npm install -g @openai/codex
          export OPENAI_API_KEY="${{ secrets.OPENAI_API_KEY }}"
          codex exec --full-auto "Update CHANGELOG for next release"
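In headless jobs like this, it's worth failing fast when the secret is missing rather than letting Codex error partway through a run. A small sketch of that guard (the `run_codex` wrapper is illustrative, and the real command is echoed so the sketch runs without the CLI installed):

```shell
#!/bin/sh
# Guard for headless Codex runs: refuse to start without an API key.
run_codex() {
  if [ -z "${OPENAI_API_KEY:-}" ]; then
    echo "OPENAI_API_KEY is not set; aborting Codex run" >&2
    return 1
  fi
  # In real CI this line would be: codex exec --full-auto "$@"
  echo "would run: codex exec --full-auto $*"
}

unset OPENAI_API_KEY
run_codex "Update CHANGELOG for next release" || echo "guard tripped"
```

With the key unset, the wrapper aborts and the script prints "guard tripped" instead of burning a CI run.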
Multimodal Input: From Screenshot to Code
One of Codex CLI's standout features is its support for multimodal input. You can pass screenshots, wireframes, or diagrams alongside text prompts:
codex "Build a deployment status dashboard that looks like this screenshot" --image ./dashboard-mockup.png
This is particularly powerful for creating deployment monitoring interfaces or recreating designs from visual specifications.
Git Integration for Release Management
Feed Git output directly to Codex for automated documentation:
# Generate release notes from recent commits
codex exec "Create detailed release notes from these commits: $(git log --oneline v1.0.0..HEAD)"
# Analyse changes between branches
codex exec "Summarise these changes and highlight any deployment-related modifications: $(git diff main..feature/deployment-updates)"
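If you want more structured release notes, you can pre-group commits before handing them to Codex. A sketch that sorts `git log --oneline` output by conventional-commit prefix (the grouping scheme is an assumption; adapt it to your commit style):

```shell
# Group "git log --oneline" lines into Features / Fixes / Other sections.
group_commits() {
  awk '
    /^[0-9a-f]+ feat/ { feats = feats "- " $0 "\n"; next }
    /^[0-9a-f]+ fix/  { fixes = fixes "- " $0 "\n"; next }
                      { other = other "- " $0 "\n" }
    END {
      if (feats) printf "## Features\n%s", feats
      if (fixes) printf "## Fixes\n%s", fixes
      if (other) printf "## Other\n%s", other
    }'
}

# Example usage (normally: git log --oneline v1.0.0..HEAD | group_commits):
printf '%s\n' "a1b2c3d feat: add maintenance mode" "d4e5f6a fix: retry failed migration" |
  group_commits
```

The grouped summary can then be embedded in the Codex prompt in place of the raw log.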
Pre-deployment Code Reviews
Before deploying, have Codex review your changes:
codex "Review the changes in this branch. Flag any potential issues
that could cause problems in production, especially around
database migrations and API changes"
Writing Deployment Scripts
For Ruby/Rails projects common in the DeployHQ user base:
codex "Write a deployment script that:
- Puts the app in maintenance mode
- Pulls the latest code
- Runs bundle install
- Runs database migrations
- Precompiles assets
- Restarts the application server
- Takes the app out of maintenance mode
- Includes error handling and rollback capabilities"
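A script produced from a prompt like this tends to follow a run-with-rollback pattern. Here's a hand-written sketch of that shape, not Codex's literal output; the paths, Rake tasks, and service name are placeholders, and DRY_RUN lets you preview the steps without executing anything:

```shell
#!/bin/sh
set -eu

# Placeholders -- adjust for your app and server layout.
APP_DIR="${APP_DIR:-/var/www/app}"
DRY_RUN="${DRY_RUN:-1}"   # default to dry-run; set DRY_RUN=0 to execute for real

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "[dry-run] $*"
  else
    "$@"
  fi
}

deploy() {
  run touch "$APP_DIR/tmp/maintenance.txt"   # enter maintenance mode
  run git -C "$APP_DIR" pull --ff-only       # pull the latest code
  run bundle install
  run bundle exec rake db:migrate
  run bundle exec rake assets:precompile
  run systemctl restart puma                 # restart the app server
  run rm -f "$APP_DIR/tmp/maintenance.txt"   # exit maintenance mode
}

if deploy; then
  echo "Deploy complete"
else
  echo "Deployment failed; rolling back" >&2
  run rm -f "$APP_DIR/tmp/maintenance.txt"   # minimal rollback: restore traffic
  exit 1
fi
```

Run with DRY_RUN=1 (the default here) it just echoes each step, which is also a convenient way to review what a Codex-generated script would do before trusting it.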
Extending Codex with MCP
Like other modern AI CLI tools, Codex supports the Model Context Protocol (MCP) for connecting to external tools and services.
Configuring MCP Servers
Add MCP server configurations to your ~/.codex/config.toml:
[mcp_servers.github]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-github"]

[mcp_servers.filesystem]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
This enables prompts that draw on the connected servers' tools, for example:
> List open pull requests that are ready for deployment
> Create a PR from this branch to main with a deployment checklist
Custom Instructions with AGENTS.md
You can give Codex project-specific context using AGENTS.md files. Codex looks for these files in multiple locations and merges them:
- ~/.codex/AGENTS.md - Your personal preferences
- Root of your repository - Shared project notes
- Current working directory - Feature-specific instructions
Example AGENTS.md for a deployment-focused project:
# Project Instructions
## Deployment Context
- This is a Ruby on Rails application deployed via DeployHQ
- Production server runs Ubuntu 22.04 with Nginx and Puma
- Database is PostgreSQL
## Deployment Rules
- Always run migrations before deploying new features
- Never deploy directly to production without staging validation
- Include rollback commands in deployment scripts
## Coding Standards
- Follow the existing code style
- Add tests for any deployment-related scripts
- Document any changes to deployment configuration
Codex Cloud Integration
Beyond the CLI, Codex also offers a cloud-based experience through ChatGPT. You can assign tasks from the web interface or mobile app, and Codex will work on them asynchronously in a sandboxed environment.
This is particularly useful for:
- Background Tasks: Kick off refactoring or documentation work while you focus on other things
- Mobile Access: Assign tasks from your phone during commute or travel
- Parallel Work: Run multiple agents on different tasks simultaneously
The cloud and CLI experiences are integrated—you can start work in one and continue in the other without losing context.
IDE Integration
Codex also offers a VS Code extension (compatible with Cursor and Windsurf) that brings the same capabilities into your editor. When working in the IDE, Codex can use context like open files and selected code to provide more targeted assistance.
Best Practices
Create Git Checkpoints: Before running Codex in auto modes, create a commit so you can easily revert if needed.
Use Approval Modes Appropriately: Start with --suggest for critical deployments, and reserve --full-auto for well-understood tasks in development environments.
Leverage AGENTS.md: Document your project's deployment patterns and constraints so Codex understands your specific requirements.
Review Generated CI/CD Configs: While Codex is excellent at generating pipeline configurations, always review them before committing, especially for production deployments.
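The checkpoint advice above is easy to script. A minimal sketch, demonstrated here in a throwaway repo (in practice you'd run the middle steps inside your own project):

```shell
# Demo in a throwaway repo; in practice run these steps in your project.
repo=$(mktemp -d) && cd "$repo"
git init -q && git config user.email dev@example.com && git config user.name dev
echo "v1" > app.txt && git add -A && git commit -q -m "base"

# 1) Checkpoint before letting Codex loose in an auto mode.
git add -A
git commit -q --allow-empty -m "checkpoint: before codex run"

# 2) (Codex runs here and edits files, e.g. in --full-auto mode.)
echo "agent edit" > app.txt

# 3) Not happy with the result? Drop back to the checkpoint.
git reset -q --hard HEAD
cat app.txt   # prints "v1" -- the agent's edits are gone
```

Because the checkpoint commit captures the full pre-run state, a single `git reset --hard HEAD` undoes everything the agent touched.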
Pricing
Codex CLI is included with ChatGPT Plus, Pro, Business, Edu, and Enterprise plans. If you're using API credits directly, GPT-5-Codex pricing follows standard OpenAI API rates.
For teams already subscribed to ChatGPT, Codex CLI represents excellent value as it's included in your existing subscription.
What's Next
Codex CLI represents OpenAI's vision of AI-assisted development: a tool that meets you where you work and adapts to your workflow. Its combination of powerful reasoning models, multimodal input, and flexible approval modes makes it a compelling choice for deployment automation.
In the next post, we'll explore Google's Gemini CLI and see how it approaches similar problems with its own unique features. Then, we'll wrap up the series with a comprehensive comparison to help you choose the right tool for your team.
Looking to automate your deployment pipeline? DeployHQ handles the complexity of deploying code from Git to your servers, giving you more time to focus on writing great code.