OpenCode Installation Guide
OpenCode is distributed as a Node.js CLI tool. This guide covers installation on macOS and Linux, the primary supported platforms. Windows users should use WSL2.
Prerequisites
Before installing, ensure you have the following on your machine:
- Node.js: Version 18.0.0 or higher.
- Git: For cloning repositories.
- Docker (Optional): Only required if you plan to run agents in sandboxed containers.
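You can confirm the Node.js requirement from the shell. The sketch below hardcodes a sample version string; in practice you would substitute the output of `node --version`:

```shell
# Extract the major version from a Node.js version string and compare
# it against the minimum required by OpenCode (18).
required_major=18
node_version="v20.11.1"      # in practice: node_version="$(node --version)"
major="${node_version#v}"    # strip the leading "v"   -> "20.11.1"
major="${major%%.*}"         # strip minor/patch parts -> "20"
if [ "$major" -ge "$required_major" ]; then
  echo "Node.js OK (major version $major)"
else
  echo "Node.js too old (major version $major, need >= $required_major)" >&2
fi
```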
Installing via NPM
The recommended way to install OpenCode is via npm or pnpm.
```shell
# Install globally
npm install -g @opencode-ai/cli

# Verify installation
opencode --version
```
If you see a version number (e.g., v0.12.0), you are ready to go.
Troubleshooting Permissions
If you encounter EACCES errors, avoid using sudo. Instead, use a node version manager like nvm or set up your npm prefix correctly.
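One standard way to fix the prefix is to point npm's global install directory at a user-writable location. The directory name `~/.npm-global` below is only a common convention, not something OpenCode requires:

```shell
# Point npm's global prefix at a user-writable directory so that
# "npm install -g" no longer needs root.
mkdir -p "$HOME/.npm-global"
if command -v npm >/dev/null 2>&1; then
  npm config set prefix "$HOME/.npm-global"
fi

# Make the new bin directory visible; add this line to ~/.zshrc or ~/.bashrc.
export PATH="$HOME/.npm-global/bin:$PATH"
```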
Initial Configuration
Once installed, you need to initialize the configuration. This creates a .opencode config file in your home directory.
```shell
opencode init
```
This interactive command will ask you:
- Default Model Provider: Choose 'Ollama' (Local) or 'OpenAI/Anthropic' (Cloud).
- Sandbox Mode: Recommended 'On' for extensive refactoring tasks.
- Plugins: Select default plugins to install.
Setting Up API Keys
If you are not using Local Models, you will need to set API keys. OpenCode respects standard environment variables.
Add these to your ~/.zshrc or ~/.bashrc:
```shell
# For Anthropic models
export ANTHROPIC_API_KEY="sk-ant-..."

# For OpenAI models
export OPENAI_API_KEY="sk-..."
```
Reload your shell:
```shell
source ~/.zshrc
```
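A quick way to confirm the keys are visible to new processes is to probe the environment. `check_key` below is a throwaway helper for illustration, not an OpenCode command:

```shell
# check_key VAR_NAME: report whether the named environment variable
# is set to a non-empty value (hypothetical helper, not part of OpenCode).
check_key() {
  eval "val=\${$1:-}"
  if [ -n "$val" ]; then
    echo "$1 is set"
  else
    echo "$1 is NOT set"
  fi
}

check_key ANTHROPIC_API_KEY
check_key OPENAI_API_KEY
```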
Verifying the Setup
Let's run a simple "Hello World" task to ensure the agent can execute commands.
```shell
opencode run "Create a file named hello.txt and write 'OpenCode was here' into it"
```
If successful, you should see:
- Thoughts streaming in the terminal.
- A file creation confirmation.
- The `hello.txt` file in your directory.
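Beyond watching the stream, you can check the file directly (assuming the task ran in your current directory):

```shell
# Confirm that the agent's output file exists and contains the expected text.
if grep -q "OpenCode was here" hello.txt 2>/dev/null; then
  echo "hello.txt verified"
else
  echo "hello.txt missing or has the wrong contents" >&2
fi
```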
Common Pitfalls
- Node version incompatibility: OpenCode relies on the fetch API built into modern Node.js. Ensure you are on Node 18 or later.
- Ollama Connection: If using local models, ensure Ollama is serving on port 11434. See OpenCode + Ollama for details.
Next Steps
Now that you have the CLI running:
- Learn basic CLI Commands to become efficient.
- Configure your first MCP Server to give the agent tools.