OpenAI Codex CLI Complete Installation Guide: From npm Setup to Your First AI-Generated Code

OpenAI Codex CLI is an open-source command-line tool that brings the power of AI-assisted coding directly to your terminal. It interprets natural language prompts, reads your codebase, proposes changes, and can even execute commands — all within a configurable sandbox environment for safety. This guide walks you through every step from installation to generating your first code.

Prerequisites

- Node.js 22 or higher — Codex CLI requires a modern Node.js runtime
- An OpenAI API key with access to models such as o4-mini or o3
- Git — recommended for version-controlled projects
- Operating system: macOS or Linux (Windows users should use WSL2)

Verify your Node.js version before proceeding:

```shell
node --version
```

The output must be v22.0.0 or higher.
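The version check can be scripted. A minimal sketch that compares the major version against the v22 minimum; the `version` value is hardcoded here for illustration, whereas in practice you would capture the output of `node --version`:

```shell
# Compare a Node.js version string against the v22 minimum.
# NOTE: version is hardcoded for illustration; normally: version=$(node --version)
required=22
version="v22.3.0"
major=${version#v}        # strip the leading "v"
major=${major%%.*}        # keep only the major component
if [ "$major" -ge "$required" ]; then
  echo "Node.js OK ($version)"
else
  echo "Node.js too old ($version); install v${required}+"
fi
```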

Step 1: Install OpenAI Codex CLI via npm

Install the Codex CLI globally using npm:

```shell
npm install -g @openai/codex
```

Verify the installation was successful:

```shell
codex --version
```

If you encounter permission errors on macOS or Linux, avoid using sudo. Instead, configure npm's global directory:

```shell
mkdir -p ~/.npm-global
npm config set prefix '~/.npm-global'
export PATH=~/.npm-global/bin:$PATH
```

Add the export line to your ~/.bashrc or ~/.zshrc for persistence.
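To make that persistence step robust, append the export line only if it is not already present, so repeated runs don't pile up duplicates. A minimal sketch that writes to a local `demo_bashrc` file (a stand-in for your real ~/.bashrc, so it is safe to run anywhere):

```shell
# Idempotent append: the export line is added at most once.
# demo_bashrc stands in for ~/.bashrc so the sketch is safe to run anywhere.
line='export PATH=~/.npm-global/bin:$PATH'
rc="demo_bashrc"
: > "$rc"                      # start from an empty demo file
for attempt in 1 2; do         # run the append twice on purpose
  grep -qxF "$line" "$rc" || printf '%s\n' "$line" >> "$rc"
done
count=$(grep -cxF "$line" "$rc")
echo "occurrences: $count"
rm -f "$rc"
```

`grep -qxF` matches the whole line as a fixed string, so the guard is not fooled by regex metacharacters in the PATH entry.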

Step 2: Configure Your OpenAI API Key

Codex CLI authenticates via the OPENAI_API_KEY environment variable. Set it in your shell profile:

```shell
# Add to ~/.bashrc, ~/.zshrc, or ~/.profile
export OPENAI_API_KEY="YOUR_API_KEY"
```

Reload your shell configuration:

```shell
source ~/.bashrc
```

Alternatively, create a .env file in your project root:

```shell
echo 'OPENAI_API_KEY=YOUR_API_KEY' > .env
```

Codex CLI will automatically detect the .env file when run from that directory.
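Shells do not read .env files natively; tools like Codex load them for you. A rough sketch of what that loading amounts to, using `set -a` so every assignment in the file is exported (the key below is a placeholder, not a real credential; Codex performs this step itself, so the sketch is only illustrative):

```shell
# Create a demo .env and source it with auto-export enabled.
printf 'OPENAI_API_KEY=sk-demo-placeholder\n' > .env   # placeholder value
set -a          # every subsequent assignment is exported
. ./.env
set +a
echo "OPENAI_API_KEY is ${OPENAI_API_KEY:+set}"
rm -f .env
```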

Verify API Key Configuration

```shell
codex "Say hello"
```

The command should return a response without authentication errors.

Step 3: Understand and Configure Sandbox Security Modes

One of Codex CLI's most important features is its approval policy system, which controls how much autonomy the AI agent has. There are three modes:

| Mode | Flag | File Edits | Command Execution | Best For |
|------|------|------------|-------------------|----------|
| **Suggest** | `--approval-mode suggest` | Requires approval | Requires approval | Maximum safety, reviewing each change |
| **Auto Edit** | `--approval-mode auto-edit` | Auto-applied | Requires approval | Rapid prototyping with safe commands |
| **Full Auto** | `--approval-mode full-auto` | Auto-applied | Auto-executed in sandbox | Automated pipelines, CI/CD tasks |
Start with **Suggest** mode (the default) until you are comfortable with the tool's behavior:

```shell
# Explicit suggest mode (default)
codex --approval-mode suggest "Refactor the utils module"
```
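Because the mode is just a flag, it is easy to vary per environment. A hypothetical sketch that picks full-auto in CI (relying on the common convention that CI systems set a `CI` environment variable) and suggest mode everywhere else; the command is echoed rather than executed:

```shell
# Pick an approval mode based on environment; echo the resulting command.
if [ -n "${CI:-}" ]; then
  mode="full-auto"    # non-interactive pipelines cannot answer approval prompts
else
  mode="suggest"      # default: review every change by hand
fi
cmd="codex --approval-mode $mode \"Refactor the utils module\""
echo "$cmd"
```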

When using **Full Auto** mode, Codex applies network-disabled, directory-scoped sandboxing. On macOS it uses Apple Seatbelt; on Linux it uses Docker-based isolation:

```shell
# Full auto with sandboxed execution
codex --approval-mode full-auto "Write and run tests for auth.js"
```

Step 4: Select Your Model

Codex CLI defaults to o4-mini but supports other OpenAI models. Choose a model based on task complexity:

```shell
# Use the default o4-mini (fast, cost-effective)
codex "Add input validation to the signup form"
```

Use o3 for complex reasoning tasks:

```shell
codex --model o3 "Redesign the database schema for multi-tenancy"
```
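The model choice can be wrapped in a small helper so routine prompts default to o4-mini and only explicitly flagged ones pay for o3. A hypothetical sketch (`codex_run` is not part of the CLI; it echoes the command it would run rather than executing it):

```shell
# codex_run: hypothetical wrapper that routes prompts by a --complex flag.
codex_run() {
  model="o4-mini"                       # cost-effective default
  if [ "$1" = "--complex" ]; then
    model="o3"                          # reserve o3 for hard reasoning tasks
    shift
  fi
  echo "codex --model $model \"$1\""    # echoed for illustration, not executed
}
routine=$(codex_run "Add input validation to the signup form")
complex=$(codex_run --complex "Redesign the database schema for multi-tenancy")
echo "$routine"
echo "$complex"
```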

Step 5: Generate Your First Code

Navigate to your project directory and run your first real prompt:

```shell
cd ~/projects/my-app
```

Generate a new utility function:

```shell
codex "Create a TypeScript utility function that debounces
any async function with configurable delay and max wait time"
```

Codex will read your project context, propose a file to create or edit, and show a diff for your approval. Press Enter to accept or Esc to reject.

Interactive Session Example

Launch Codex without a prompt for an interactive multi-turn session:

```shell
codex
```

Now type prompts interactively:

> Find all API endpoints that lack authentication middleware

> Add rate limiting to the /api/upload route

Project-Level Configuration with codex.md

Create a codex.md file in your repository root to provide persistent context:

```markdown
# codex.md
This is a Next.js 15 project with App Router.
Use TypeScript strict mode.
Follow the existing patterns in src/lib/.
Tests use Vitest. Run tests with: npm run test
Database: PostgreSQL via Prisma ORM.
```

Codex automatically reads this file and follows its instructions on every invocation.
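Scaffolding the file from the shell is a one-liner with a heredoc. A minimal sketch (the contents are a trimmed-down version of the example above):

```shell
# Scaffold a minimal codex.md in the current directory via a heredoc.
cat > codex.md <<'EOF'
This is a Next.js 15 project with App Router.
Use TypeScript strict mode.
Tests use Vitest. Run tests with: npm run test
EOF
lines=$(grep -c '' codex.md)     # count the lines written
echo "codex.md: $lines lines"
rm -f codex.md
```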

Pro Tips for Power Users

- **Pipe input directly:** `cat error.log | codex "Explain this error and suggest a fix"`
- **Quiet mode for scripts:** Use `codex -q "Generate a migration"` to print only the final output, ideal for CI pipelines.
- **Multi-turn context:** In interactive mode, Codex retains full conversation context. Build complex changes step by step.
- **Custom instructions per project:** Use codex.md in any subdirectory for scoped instructions that override the root file.
- **Cost control:** Stick with o4-mini for routine tasks. Reserve o3 for architectural decisions or complex debugging.
- **Git integration:** Run Codex inside a Git repo so you can always review diffs with `git diff` and revert with `git checkout .`
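The Git safety net in the last tip can be seen end to end. A sketch using a throwaway repository: the codex call itself is shown only as a comment, and a plain `echo` stands in for an AI-applied change:

```shell
# Demonstrate reverting an unwanted change with git checkout in a demo repo.
git init -q codex-demo && cd codex-demo
echo "original" > app.txt
git add app.txt
git -c user.name=demo -c user.email=demo@example.com commit -qm "baseline"
# ... here you would run: codex --approval-mode auto-edit "Refactor app.txt"
echo "ai-modified" > app.txt      # stand-in for an AI-applied edit
git diff --stat                   # review what changed
git checkout -- .                 # revert everything back to the last commit
cat app.txt                       # back to the committed content
cd ..
```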

Troubleshooting Common Errors

| Error | Cause | Solution |
|-------|-------|----------|
| `EACCES: permission denied` | npm global install without permission | Configure the npm prefix as shown in Step 1, or use `npx @openai/codex` |
| `401 Unauthorized` | Missing or invalid API key | Verify that `echo $OPENAI_API_KEY` outputs your key correctly |
| Node.js version not supported | Running Node.js below v22 | Install Node.js 22+ via `nvm install 22` |
| `ECONNREFUSED` or network timeout | Firewall or proxy blocking API calls | Check proxy settings: `export HTTPS_PROXY=http://your-proxy:port` |
| Sandbox execution fails on Linux | Docker not installed or running | Install Docker and ensure the daemon is active: `sudo systemctl start docker` |
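Most of these failures can be triaged with a quick environment check before digging deeper. A minimal diagnostic sketch that only reports and changes nothing:

```shell
# Report the state of the prerequisites behind the common errors above.
if [ -n "${OPENAI_API_KEY:-}" ]; then key_state="set"; else key_state="missing"; fi
echo "API key: $key_state"
if command -v node >/dev/null 2>&1; then
  echo "node: $(node --version)"
else
  echo "node: not found (install Node.js 22+)"
fi
if command -v docker >/dev/null 2>&1; then
  echo "docker: found"
else
  echo "docker: not found (Linux full-auto sandboxing needs it)"
fi
```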
Frequently Asked Questions

Is OpenAI Codex CLI free to use?

The CLI tool itself is free and open-source (Apache 2.0 license). However, it requires an OpenAI API key, and API usage is billed based on token consumption. The default model o4-mini is the most cost-effective option for everyday tasks.

Can Codex CLI work with non-JavaScript projects?

Yes. Codex CLI is language-agnostic. It reads your project files regardless of language — Python, Rust, Go, Java, C++, and more are all supported. It analyzes your codebase structure and generates context-appropriate code in whatever language your project uses.

How does the sandbox protect my system in Full Auto mode?

In Full Auto mode, Codex executes commands inside a restricted sandbox. On macOS, it uses Apple’s Seatbelt framework to disable network access and restrict filesystem writes to the current working directory and temporary folders. On Linux, it uses containerized execution via Docker. This prevents any AI-initiated command from accessing the internet or modifying files outside your project scope.
