A VS Code extension that provides a chat interface for the OpenAI Codex CLI. @-mention a file or folder to send it to Codex in just one click. You can use your own API key.
> Note: The official Codex VS Code extension was released a few days after this Codexia release.
- 🤖 AI Chat Interface: Chat with Codex AI directly in the VS Code sidebar
- ⚙️ Easy Configuration: UI-based setup for OSS mode, models, and providers
- 🔄 Real-time Streaming: See responses as they're generated (`show_raw_agent_reasoning=true`)
- ⚡ Command Execution: Approve/deny AI-suggested command executions
- 🎨 VS Code Theme Integration: Matches your VS Code theme
- 📁 Workspace Aware: Automatically uses current workspace context
- 🛠️ Protocol-based: Uses Codex CLI's native protocol for reliable communication
Download from GitHub releases.
- `codex proto`:
  ```sh
  codex -c model_provider=oss -m gpt-oss:20b proto
  ```
- Process Communication: JSON-RPC over stdin/stdout streams
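Consuming a newline-delimited JSON stream like this means buffering stdout chunks and splitting on newlines, since a chunk may end mid-message. A minimal sketch (the class name is mine, not part of the extension):

```typescript
// Sketch: protocol mode emits one JSON message per stdout line.
// Stream chunks may split a message, so buffer the trailing partial line.
class JsonLineParser {
  private buffer = "";

  // Feed a raw stdout chunk; returns the complete messages it contained.
  push(chunk: string): unknown[] {
    this.buffer += chunk;
    const lines = this.buffer.split("\n");
    this.buffer = lines.pop() ?? ""; // keep the incomplete tail for the next chunk
    return lines
      .filter((line) => line.trim().length > 0)
      .map((line) => JSON.parse(line));
  }
}
```

The same parser instance is fed every `data` event from the child process's stdout, so messages arrive intact regardless of how the OS chunks the stream.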
- Codex CLI installed and available in PATH
- VS Code version 1.103.0 or higher
- An API key for your preferred AI provider (OpenAI, Anthropic, etc.)
- Clone this repository:
  ```sh
  git clone https://github.com/milisp/codexia-vscode
  cd codexia-vscode
  ```
- Install dependencies:
  ```sh
  pnpm install
  ```
- Compile the extension:
  ```sh
  pnpm run compile
  ```
- Package the extension (optional):
  ```sh
  npx vsce package
  ```
- Install in VS Code:
- Open VS Code
- Go to Extensions view (Ctrl+Shift+X)
- Click "..." menu > "Install from VSIX..."
- Select the generated `.vsix` file
- Open the project in VS Code
- Press F5 to open Extension Development Host
- The extension will be loaded in the new window
- Configure Codex CLI: Make sure you have Codex CLI installed and configured with your preferred AI provider:
  ```sh
  # Check if codex is installed
  codex --version
  # Test basic functionality
  codex -h
  ```
- Set up API Keys: Configure the API key for your chosen provider:
  ```sh
  # For OpenAI
  export OPENAI_API_KEY="your-api-key"
  # For Anthropic Claude
  export ANTHROPIC_API_KEY="your-api-key"
  # For other providers, check the Codex CLI documentation
  ```
- Open Codexia: Click the robot icon in the VS Code activity bar to open the Codexia panel
- Configure Settings:
  - Go to the Settings tab in the Codexia panel
  - Choose your preferred setup:
    - ✅ OSS Mode (`--oss`): Use local models via Ollama (like `llama3.2`)
    - 🤖 Cloud Providers: OpenAI, Anthropic, Gemini, etc.
    - ⚙️ Advanced: Custom arguments, approval policies, sandbox modes
- Start Chatting:
- Go to the Chat tab
- Type your message and press Enter or click send
- Approve Commands: When Codex suggests running commands, you'll see approval buttons:
  - ✅ Approve: Allow the command to execute
  - ❌ Deny: Reject the command execution
- Manage Chat:
- New Task: Click the "+" button to start fresh
- Clear History: Click the clear button to reset the chat
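Internally, an approve/deny flow like the one above can be modeled as a map of pending requests: the Codex service waits on a promise, and the webview's button click resolves it. This is a hedged sketch; the names and message shapes are my assumptions, not the extension's actual API:

```typescript
// Hypothetical sketch of the approval round-trip between the webview UI
// and the extension host. Each command request gets an id; the user's
// ✅/❌ click resolves the matching promise so the service can reply.
class ApprovalManager {
  private pending = new Map<string, (approved: boolean) => void>();

  // Called when Codex asks to run a command; resolves once the user decides.
  waitForDecision(requestId: string): Promise<boolean> {
    return new Promise((resolve) => this.pending.set(requestId, resolve));
  }

  // Called from the webview message handler with the user's choice.
  // Returns false for unknown or already-handled requests.
  resolve(requestId: string, approved: boolean): boolean {
    const cb = this.pending.get(requestId);
    if (!cb) return false;
    this.pending.delete(requestId);
    cb(approved);
    return true;
  }
}
```

Keeping the decision behind a promise lets the protocol reader stay asynchronous: the stream keeps flowing while one command waits for the user.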
The extension provides these VS Code commands:
- `codexia.newTask`: Start a new conversation
- `codexia.openSettings`: Open extension settings
- `codexia.clearHistory`: Clear chat history
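In a VS Code extension, commands like these are declared under `contributes.commands` in `package.json`. A minimal sketch of what that declaration could look like (the titles here are my guesses, not the extension's actual strings):

```json
{
  "contributes": {
    "commands": [
      { "command": "codexia.newTask", "title": "Codexia: New Task" },
      { "command": "codexia.openSettings", "title": "Codexia: Open Settings" },
      { "command": "codexia.clearHistory", "title": "Codexia: Clear History" }
    ]
  }
}
```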
The extension provides a user-friendly settings interface:
- Open the Settings tab in the Codexia panel
- Configure your preferences:
- OSS Mode: Enable for local models (`codex --oss -m llama3.2`)
- Model: Choose from available models or enter a custom one
- Provider: Select OpenAI, Anthropic, Gemini, etc.
- Approval Policy: Control command execution approval
- Sandbox Mode: Set file access permissions
- Custom Args: Add additional command-line arguments
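For illustration, the mapping from these settings to Codex CLI arguments could look like the following. The `CodexSettings` shape is hypothetical; the flags themselves match the CLI examples in this README:

```typescript
// Hypothetical settings shape; the real extension's config may differ.
interface CodexSettings {
  ossMode: boolean;
  model?: string;
  provider?: string;
  customArgs?: string[];
}

// Build the argument list for `codex ... proto` from the panel settings.
function buildProtoArgs(s: CodexSettings): string[] {
  const args: string[] = [];
  if (s.ossMode) {
    args.push("--oss");
  } else if (s.provider) {
    args.push("-c", `model_provider=${s.provider}`);
  }
  if (s.model) args.push("-m", s.model);
  if (s.customArgs) args.push(...s.customArgs);
  args.push("proto"); // run Codex in protocol mode
  return args;
}
```

With OSS mode on and model `llama3.2`, this yields `--oss -m llama3.2 proto`, matching the equivalent command shown below.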
For local Ollama models:

```
# Equivalent to: codex --oss -m llama3.2 proto
✅ OSS Mode: Enabled
📱 Model: llama3.2
```

For OpenAI GPT-5:

```
# Equivalent to: codex -c model_provider=openai -m gpt-5 proto
✅ OSS Mode: Disabled
🤖 Provider: openai
📱 Model: gpt-5
```

For Anthropic Claude:

```
# Equivalent to: codex -c model_provider=anthropic -m claude-3-5-sonnet-20241022 proto
✅ OSS Mode: Disabled
🤖 Provider: anthropic
📱 Model: claude-3-5-sonnet-20241022
```
You can still use Codex CLI configuration files if needed:
```sh
# Set default model
codex -c model="gpt-5"
# Set provider
codex -c model_provider="anthropic"
# Set approval policy
codex -c approval_policy="on-request"
```
The extension consists of:
- Extension Host (`src/extension.ts`): Main extension entry point
- Chat Provider (`src/chatProvider.ts`): Manages the webview and user interactions
- Codex Service (`src/codexService.ts`): Handles communication with Codex CLI via protocol mode
- Webview UI (`media/`): HTML/CSS/JS for the chat interface
```sh
# Make sure codex is in your PATH
which codex
# or
codex --version
```
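An extension can run the same check programmatically before starting a session. A sketch of one way to do it, using Node's `spawnSync` (the function name is mine; this is not the extension's actual code):

```typescript
import { spawnSync } from "child_process";

// Sketch: report whether a binary is reachable on PATH by invoking
// `<binary> --version` and checking for a clean exit. spawnSync sets
// `error` when the binary cannot be launched at all (e.g. ENOENT).
function isOnPath(binary: string): boolean {
  const result = spawnSync(binary, ["--version"], { encoding: "utf8" });
  return result.error === undefined && result.status === 0;
}
```

Calling `isOnPath("codex")` at activation time lets the extension surface a clear "Codex CLI not found" message instead of a cryptic spawn failure later.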
If you see "Codex session timeout" errors:
- Check your internet connection
- Verify your API key is valid
- Try restarting the extension
If execution requests aren't working:
- Check your Codex CLI approval policy configuration
- Make sure you have necessary permissions in the workspace
```
src/
├── extension.ts      # Extension entry point
├── chatProvider.ts   # Webview provider
└── codexService.ts   # Codex CLI integration
media/
├── main.css          # Webview styles
├── main.js           # Webview JavaScript
├── reset.css         # CSS reset
└── vscode.css        # VS Code theme variables
```
```sh
# Development build
pnpm run compile
# Watch mode
pnpm run watch
# Package
npx vsce package
# Test Codex availability
node test-codex.js
# Test protocol communication
node test-protocol.js
```
- Fork the repository
- Create a feature branch
- Make your changes
- Test thoroughly
- Submit a pull request
- Codexia: a Tauri v2 GUI for the OpenAI Codex CLI
- Inspired by Cline for the UI design patterns
- Built on top of Codex CLI for AI integration
- Uses VS Code's extension API and webview system
- ✅ Basic chat integration
- 🚧 @Files from FileTree support
- 🚧 Note system
- 🚧 Advanced configuration options