OCCode CLI — User Guide

1. Introduction

What is OCCode CLI?

OCCode CLI is a terminal-based AI coding assistant developed by OpenCan.ai. It brings the power of advanced language models directly to your command line with multi-provider support, real-time cost tracking, checkpoint-based undo, and over 60 interactive commands.

Multi-Provider AI

11 providers, 40+ models including Claude, GPT, Gemini, DeepSeek, Ollama & more

Checkpoint System

Save and restore file states for safe undo/rollback of any change

Cost Tracking

Real-time token usage and cost display across all providers

Session Persistence

Resume conversations without losing context

Convergence Engine

Multi-model ensemble with Merge, Vote, Debate, and Review strategies

Git Integration

AI-generated commit messages, enhanced diffs, auto-stage

System Requirements

Requirement    Details
Node.js        18.0 or higher (for npm install method)
OS             Windows 10+, macOS 12+, Linux (x64/arm64)
Disk Space     ~50MB (binary) or ~5MB (npm)
Network        Required for cloud AI providers; optional for Ollama/local

2. Installation

Download (Account Required)

Download OCCode from the Install Guide. An opencan.ai account is required, and a valid license key is needed to activate the application.

Windows x64: occode-windows-x64.zip (47 MB)
  SHA256: 7f4840e8355a2ec64adfe0e667f1486acf19c5313fbb49dcc3c750636307b85a
macOS Intel: occode-macos-x64.zip (51 MB)
  SHA256: b62c578b38e771961b53b75f3851edb816ce0339155251af7bf9cf0847640c24
macOS Apple Silicon: occode-macos-arm64.zip (51 MB)
  SHA256: 53befd5d8d381e24d1bb85072e6332e565350973595cea4210bb7fd227a0e3b5
Linux x64: occode-linux-x64.zip (50 MB)
  SHA256: fe27a7e04d1d6aa8511de2e23436f6f2891ad01de8d8cffb84cead12f790b965

Install Procedure:
  1. Download the zip file for your platform from the Install Guide
  2. Obtain your license key from your opencan.ai dashboard
  3. Verify the SHA256 checksum to confirm the download was not corrupted
  4. Extract the zip file and place the binary in your PATH
  5. Activate with: occode activate --key YOUR-LICENSE-KEY
# Linux/macOS
# Verify the checksum before extracting (on macOS use: shasum -a 256)
sha256sum occode-linux-x64.zip

unzip occode-linux-x64.zip # or occode-macos-*.zip
sudo mv occode /usr/local/bin/

# Windows - right-click zip → Extract All, then add folder to PATH

# Activate with your license key
occode activate --key YOUR-LICENSE-KEY
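The checksum comparison in step 3 can also be scripted. A minimal Python sketch (the filename is illustrative):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 65536) -> str:
    """Compute the SHA-256 hex digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(path: str, expected: str) -> bool:
    """Compare against the published checksum, case-insensitively."""
    return sha256_of(path) == expected.lower()
```

If `verify` returns False, re-download the archive before installing.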

Verify Installation

occode --version
occode --help

3. Quick Start

Step 1: Configure Your API Key

# Interactive configuration wizard
occode config

# Or set directly
occode config --set-key --provider anthropic

Step 2: Start an Interactive Session

occode

This launches the REPL (Read-Eval-Print Loop) where you can chat with the AI and give coding instructions.

Step 3: Run a One-Shot Task

# Execute a task and exit
occode run "Add unit tests for UserService"

# Autonomous mode (no approval prompts)
occode run "Fix all TypeScript errors" --yes

# With specific files in context
occode run "Refactor this component" -c src/App.tsx

Step 4: Understanding the Interface

# The REPL shows a timestamp prompt:
2026-02-10 14:30:12 EST your message here

# Use slash commands for quick actions:
/help # Show all commands
/cost # Show token costs
/model # Show current model
/status # Show session info

4. Core Commands

occode / occode chat — Interactive Mode

occode [chat]
  -m, --model <model> # Model to use (e.g. claude-sonnet-4-20250514)
  -p, --provider <provider> # AI provider (anthropic, openai, etc.)
  -c, --context <files...> # Add files to context
  --session <id> # Resume previous session

occode run <task> — Single Task Execution

occode run <task>
  -y, --yes # Auto-approve all actions (autonomous)
  -s, --supervised # Approve all file changes
  -m, --model <model> # Model to use
  -c, --context <files...> # Add files to context
  --dry-run # Preview without executing
  --max-turns <n> # Maximum agent turns (default: 50)
  --timeout <seconds> # Timeout limit

occode watch — Watch Mode

occode watch
  -p, --pattern <glob> # File pattern to watch
  -i, --ignore <patterns> # Patterns to ignore
  -t, --task <task> # Task to run on changes

occode config — Configuration

occode config
  --set <key=value> # Set config value
  --get <key> # Get config value
  --list # List all settings
  --reset # Reset to defaults
  --set-key # Set API key securely (keychain)

occode commit — AI Git Commit

occode commit
  -a, --all # Stage all changes
  -p, --push # Push after commit

occode explain — Explain Code

occode explain [file]
  -d, --detailed # Detailed explanation
  --architecture # Explain project structure

occode review — Code Review

occode review [path]
  --security # Focus on security
  --performance # Focus on performance
  --diff # Review staged changes

occode search — Semantic Search

occode search <query>
  -n, --limit <n> # Max results
  -t, --type <type> # Symbol type filter

occode generate — Generate Code/Docs

occode generate <type>
  # Types: readme, tests, docs, types

occode checkpoint — Manage Checkpoints

occode checkpoint list # List all checkpoints
occode checkpoint create <msg> # Create named checkpoint
occode checkpoint restore <id> # Restore to checkpoint
occode checkpoint delete <id> # Delete checkpoint

occode undo — Undo Last Change

occode undo

occode history — Session History

occode history
  -n, --limit <n> # Number of sessions
  --clear # Clear all history
  --export <file> # Export to file
  --import <file> # Import from file

Other Commands

occode init # Initialize in current directory
occode status # Show current status
occode update # Check for updates

5. Interactive Mode (REPL)

While in chat mode, use these slash commands:

Provider & Model Commands

/provider [name] # Show current or switch provider
/model [name] # Show current or change model
/api_key [key] # Show status or set API key
/api_url [url] # Show or change custom API endpoint
/api_url reset # Reset to default endpoint

Model Profile Commands (12)

/profile # Show active profile and list all
/profile <name> # Activate a profile
/profile create <name> # Create new profile interactively
/profile template <name> # Create from built-in template
/profile templates # List available templates
/profile edit <name> # Edit existing profile
/profile delete <name> # Delete a profile
/profile off # Deactivate current profile
/profile default <name> # Set default profile (auto-activates on startup)
/profile prefix <name> <prefix> # Set message prefix trigger
/profile export # Export profiles to JSON
/profile import <file> # Import profiles from JSON

Convergence Commands (18)

/converge # Show convergence status
/converge on | enable # Enable convergence mode
/converge off | disable # Disable convergence mode
/converge strategy <name> # Set strategy (merge/vote/debate/review)
/converge add <alias> # Add model by short alias
/converge remove <name> # Remove a participant model
/converge aggregator <name> # Set aggregator model for synthesis
/converge preset <name> # Load preset configuration
/converge presets # List available presets
/converge rounds <n> # Set debate rounds (1-5)
/converge show on|off # Show/hide individual model outputs
/converge models # List configured participant models
/converge catalog # Browse all models in catalog
/converge available # Show models with API keys detected
/converge search <keyword> # Search models
/converge last # Show stats from last run
/converge reset # Reset to defaults
/converge export|import # Export/import configuration

Context Management (7)

/context # Show context overview with token usage
/context add <file> # Add file to context
/context remove <file> # Remove file from context
/pin <file> # Pin file to persistent context
/unpin <file> # Remove pin from file
/exclude <pattern> # Exclude files matching glob
/include <pattern> # Remove exclusion pattern

Session Management (7)

/clear # Clear conversation history
/status # Show session statistics
/cost # Show detailed cost breakdown
/compact # Compact history to save tokens
/export <file> # Export session to JSON
/debug # Toggle debug mode
/mode interactive|auto # Set execution mode

Git Integration (3)

/git # Show git status
/commit # Generate AI-powered commit message
/diff <files> # Enhanced diff with syntax highlighting
/diff --side-by-side <files> # Side-by-side comparison

Checkpoint & Undo (4)

/checkpoint [message] # Create named checkpoint
/checkpoint list # List all checkpoints
/checkpoint restore <id> # Restore to specific checkpoint
/undo # Undo last file change

Daemon & Indexing (5)

/index # Show indexing status
/index rebuild # Force rebuild codebase index
/daemon # Show daemon status
/daemon start|stop|restart # Manage background daemon

Subscription (3)

OCCode includes a 7-day free trial with full access. Create an opencan.ai account and choose a plan to start your trial — you won't be charged until day 8. After the trial, a license key is required. Cancel before day 8 to avoid charges. No refunds after billing.

Important: Activate your license key promptly after purchase. The free trial ends on day 7, and your subscription is charged starting on day 8.

Team+ plans: For admin-generated license keys, the 7-day trial period begins at the time each key is generated. After 30 days from the initial Team+ subscription purchase, newly generated license keys will no longer include a trial period.

/subscription # Show subscription status (trial days remaining or plan info)
/subscription activate <key> # Activate license key
/subscription plans # View available plans

Feature Toggles

/features # Show all features and status
/features enable <feature> # Enable a feature
/features disable <feature> # Disable a feature
/features cost # Show token cost impact
/features reset # Reset to defaults
/tdg # Toggle Test-Driven Generation
/autofix # Toggle Auto-Fix in LSP Loop

Help & Transcripts

/help | /h | /? # Show all commands
/transcripts # Toggle transcript saving
/transcripts status # Show transcript configuration
/transcripts export [path] # Export to unencrypted JSON
/transcripts clear # Clear all entries

Keyboard Shortcuts

Key        Action
Ctrl+C     Stop current operation
Ctrl+D     Exit session
Up/Down    Navigate command history
Tab        Auto-complete file paths and commands

6. AI Providers

OCCode supports 11 AI providers with 40+ models.

Provider            Speed       Quality    Cost     Key Required
Anthropic (Claude)  Fast        Excellent  $$$      Yes
OpenAI (GPT)        Fast        Excellent  $$$      Yes
DeepSeek            Fast        Excellent  $        Yes
Google (Gemini)     Medium      Good       $$       Yes
Mistral AI          Fast        Good       $$       Yes
Groq                Ultra Fast  Good       Free*    Yes
Together AI         Fast        Good       $$       Yes
OpenRouter          Varies      Varies     Varies   Yes
Ollama (Local)      Medium      Good       Free     No
OpenCan             Varies      Varies     Varies   Yes
Custom/Local        Varies      Varies     Varies   Optional

Anthropic (Claude) Setup

occode config --set provider=anthropic
occode config --set model=claude-sonnet-4-20250514
occode config --set-key --provider anthropic
# Get key from: https://console.anthropic.com

OpenAI (GPT) Setup

occode config --set provider=openai
occode config --set model=gpt-4o
occode config --set-key --provider openai
# Get key from: https://platform.openai.com

DeepSeek Setup

occode config --set provider=deepseek
occode config --set model=deepseek-coder
occode config --set-key --provider deepseek
# Get key from: https://platform.deepseek.com

Ollama (Local - Free, Offline)

# 1. Install Ollama: https://ollama.ai/download
# 2. Pull a model:
ollama pull llama3.3

# 3. Configure OCCode:
occode config --set provider=local
occode config --set apiEndpoint=http://localhost:11434/v1
occode config --set model=llama3.3

OpenRouter (100+ Models via One Key)

occode config --set provider=openrouter
occode config --set model=anthropic/claude-sonnet-4
occode config --set-key --provider openrouter
# Get key from: https://openrouter.ai/keys

Recommended Setups

User Type          Primary                         Fallback
Professional Dev   Anthropic + Claude Sonnet       OpenRouter + DeepSeek R1
Student/Learner    Groq + Llama 3.3 70B (free)     Ollama + Llama 3.3 (free)
Team/Enterprise    OpenCan (centralized billing)   Anthropic or OpenAI
Budget-Conscious   DeepSeek + deepseek-coder ($)   Ollama (free)

7. Model Profiles

Profiles are named model configurations for quick switching.

Built-in Templates

Template   Provider    Model             Use Case
fast       Anthropic   Claude Haiku      Quick responses, low cost
power      Anthropic   Claude Opus       Complex reasoning
creative   Anthropic   High-temp Sonnet  Creative tasks
gpt        OpenAI      GPT-4 Turbo       Alternative perspective
local      Ollama      Local models      Offline, private

Usage

# Create from template
/profile template fast

# Activate
/profile fast

# Create custom profile
/profile create my-custom

# Set prefix trigger (e.g., "quick: your message" auto-activates fast profile)
/profile prefix fast quick

# Set a default profile
/profile default fast

# Export/import profiles for team sharing
/profile export
/profile import profiles.json

8. Convergence Engine

The Convergence Engine runs your query through multiple AI models simultaneously and synthesizes the best response.

4 Strategies

Strategy      How It Works                            Best For
Merge (MoA)   Parallel generation + synthesis         Maximum quality
Vote          Democratic selection via model voting   Consensus decisions
Debate        Multi-round critique and refinement     Thorough analysis
Review        Generate + review + revise workflow     Code quality

Built-in Presets

/converge preset duo-merge # 2 models with merge
/converge preset trio-merge # 3 models for max quality
/converge preset code-review # Cost-effective review workflow
/converge preset debate # Thorough analysis
/converge preset vote # Democratic consensus
/converge preset local-merge # Zero API cost (local models)

Custom Configuration

/converge on # Enable convergence
/converge strategy merge # Set strategy
/converge add sonnet # Add Claude Sonnet
/converge add gpt4 # Add GPT-4
/converge aggregator opus # Use Opus as synthesizer
/converge rounds 3 # Set 3 debate rounds
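Conceptually, a Merge run fans the prompt out to every participant in parallel and then asks the aggregator to synthesize the drafts. A sketch with stub model calls (the real engine's internals are not documented here):

```python
import asyncio

async def ask(model: str, prompt: str) -> str:
    # Stub standing in for a real provider API call.
    return f"{model}: response to {prompt!r}"

async def merge_round(participants, aggregator, prompt):
    # Fan the prompt out to all participant models in parallel...
    drafts = await asyncio.gather(*(ask(m, prompt) for m in participants))
    # ...then hand every draft to the aggregator for synthesis.
    combined = "\n".join(drafts)
    return await ask(aggregator, "synthesize: " + combined)

result = asyncio.run(merge_round(["sonnet", "gpt4"], "opus", "refactor auth"))
```

Vote and Debate differ only in what happens after the fan-out: a ballot over the drafts, or further critique rounds.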

9. Context Management

Adding Files to Context

/context add src/api/routes.ts
/context add src/**/*.ts # Glob patterns supported

Pinning Files (Persistent Across Sessions)

/pin src/types.ts # Always in context
/unpin src/types.ts # Remove pin

Excluding Files

/exclude "*.test.ts"
/exclude "**/node_modules/**"
/include "*.test.ts" # Remove exclusion

Context Overview

/context
# Output:
# Context Overview:
# Working directory: /home/user/my-project
# Token usage: 12,450 / 128,000 (9.7%)
# [###.............................]
#
# Pinned files:
# src/types.ts (2,500 tokens)
# Active context files:
# src/api/routes.ts (1,800 tokens)
# ... and 15 more files
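The token-usage line and progress bar in this overview are straightforward to reproduce. A small Python sketch (output format approximated from the example above):

```python
def usage_bar(used: int, budget: int, width: int = 32) -> str:
    """Render a usage line like the /context overview's, e.g.
    '12,450 / 128,000 (9.7%) [###...]'. Format is approximate."""
    pct = used / budget * 100
    filled = int(width * used / budget)
    bar = "#" * filled + "." * (width - filled)
    return f"{used:,} / {budget:,} ({pct:.1f}%) [{bar}]"
```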

Automatic Detection

OCCode automatically detects project type (Node.js, Python, Rust, Go, Java), frameworks (React, Django, Express), and loads relevant context.

10. Session Management

Automatic Persistence

Sessions are automatically saved to ~/.occode/sessions/. Each session gets a unique ID.

Resume a Previous Session

occode --session sess_abc123

Session Commands

/status # Messages, tokens, cost, turns
/clear # Reset conversation (current session)
/compact # Summarize to save tokens
/export session.json # Save session to file

History Management

occode history # View recent sessions
occode history -n 10 # Last 10 sessions
occode history --export sessions.json
occode history --clear # Delete all history

11. Checkpoints & Undo

Checkpoints save file states before changes so you can always roll back.

Creating Checkpoints

/checkpoint Before refactoring auth module
# or
occode checkpoint create "Before refactoring"

Tip: Checkpoints are automatically created before every destructive operation (write, edit, delete).

Restoring

/checkpoint list # See all saved states
/checkpoint restore ckpt_123 # Restore to that state

Quick Undo

/undo # Reverts last file change
# or
occode undo
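Conceptually, a checkpoint snapshots file contents before a change so that restore can copy them back. A simplified sketch (not OCCode's actual storage format):

```python
import shutil
import uuid
from pathlib import Path

class CheckpointStore:
    """Snapshot files so they can be restored later — a conceptual
    sketch of checkpoint/undo, not OCCode's implementation."""

    def __init__(self, root):
        self.root = Path(root)
        self.checkpoints = {}  # checkpoint id -> {original path: snapshot path}

    def create(self, files, message=""):
        ckpt_id = f"ckpt_{uuid.uuid4().hex[:6]}"
        snap_dir = self.root / ckpt_id
        snap_dir.mkdir(parents=True)
        saved = {}
        for f in files:
            dest = snap_dir / Path(f).name
            shutil.copy2(f, dest)         # preserve contents + metadata
            saved[str(f)] = dest
        self.checkpoints[ckpt_id] = saved
        return ckpt_id

    def restore(self, ckpt_id):
        # Copy each snapshot back over the (possibly modified) original.
        for original, snapshot in self.checkpoints[ckpt_id].items():
            shutil.copy2(snapshot, original)
```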

12. Cost Tracking

Viewing Costs

/cost
# Output:
# Cost Report
# ────────────────────────────
# Provider: anthropic
# Model: claude-sonnet-4-20250514
#
# Token Usage:
# Input: 12,450 tokens
# Output: 3,200 tokens
# Total: 15,650 tokens
#
# Total Cost: $0.0856
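Cost is derived from the token counts and per-token rates. A sketch of the arithmetic — the per-million-token rates below are illustrative assumptions, not OCCode's pricing table:

```python
# Illustrative per-million-token rates (assumptions, not published pricing).
RATES = {"claude-sonnet-4-20250514": {"input": 3.00, "output": 15.00}}

def session_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Dollar cost = input tokens * input rate + output tokens * output rate."""
    r = RATES[model]
    return (input_tokens * r["input"] + output_tokens * r["output"]) / 1_000_000
```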

Model Tiers

Tier       Examples                           Cost
Flagship   Opus, GPT-4, Gemini Pro            $$$
Balanced   Sonnet, o3-mini                    $$
Fast       Haiku, GPT-4o-mini, Gemini Flash   $
Economy    Ollama, local models               Free

Budget Tip: Use /profile fast for simple tasks and /profile power only when needed. Switch to Ollama for unlimited free usage on private/offline tasks.

13. Git Integration

AI-Powered Commits

/commit
# OCCode analyzes your diff and generates:
# "feat: Add user authentication with JWT tokens"
# Approve? [y/N]

Enhanced Diff

/diff src/auth.ts # Syntax-highlighted diff
/diff --side-by-side src/auth.ts # Side-by-side view
/diff --no-syntax src/auth.ts # Plain text diff

Git Status

/git
# Shows branch, staged files, unstaged changes

CLI Commit Command

occode commit # Interactive commit
occode commit -a # Stage all + commit
occode commit -a -p # Stage all + commit + push

14. Feature Toggles

OCCode includes powerful features that use additional AI tokens. Toggle them on/off to control costs.

High Token Impact

Feature                  Command                           Default   Description
Test-Driven Generation   /tdg                              Off       Generate tests + iterate until passing (3-10x tokens)
Visual UI Repair         /features enable visualUIRepair   Off       Screenshot analysis + auto-fix UI issues
Browser E2E Testing      /features enable browserTesting   Off       Puppeteer/Playwright auto-test generation
Auto Code Review         /features enable autoCodeReview   Off       Automatic PR review + suggestions

Medium Token Impact

Feature                    Command                                    Default
Proactive Monitoring       /features enable proactiveMonitoring       Off
Coverage-Guided Tests      /features disable coverageGuidedTests      On
Auto Documentation         /features enable autoDocumentation         Off
Refactoring Suggestions    /features enable refactoringSuggestions    Off
Performance Optimization   /features enable performanceOptimization   Off

Low Token Impact

Feature               Command    Default
Auto-Fix (LSP Loop)   /autofix   On

Token Cost Multiplier

/features cost
# Shows: Token Multiplier: 1.2x (baseline + coverage-guided tests)
# With TDG enabled: 5.0x
# With multiple features: up to 10x+

Warning: Enabling TDG + Visual UI + Code Review can increase token costs 10x+. Enable features only when needed.
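The multiplier is roughly additive over enabled features. A sketch — the per-feature increments below are assumptions, apart from the documented 1.2x baseline-plus-coverage figure:

```python
# Illustrative per-feature increments (assumptions; run /features cost
# to see your actual multiplier).
FEATURE_MULTIPLIERS = {
    "coverageGuidedTests": 0.2,   # documented: baseline + coverage = 1.2x
    "tdg": 4.0,
    "visualUIRepair": 2.0,
    "autoCodeReview": 2.0,
}

def token_multiplier(enabled) -> float:
    """Baseline 1.0x plus each enabled feature's increment."""
    return 1.0 + sum(FEATURE_MULTIPLIERS.get(f, 0.0) for f in enabled)
```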

15. P2P Messaging

OCCode includes end-to-end encrypted (E2EE) peer-to-peer messaging directly in the CLI. Send secure messages to other OCCode users without leaving your terminal. All messages are encrypted client-side — the server stores only ciphertext and cannot read message content.

Quick Start
1. Run /dm keys to generate your encryption keys
2. Run /dm search username to find a user
3. Run /dm open <userId> to start a conversation
4. Type your message and press Enter — it's encrypted and sent automatically
5. Run /dm close to exit DM mode and return to AI chat

DM Commands Reference

Command                        Description
/dm keys                       Generate or show your E2EE keypair and fingerprint
/dm rotate-keys                Generate a new keypair and revoke the old one
/dm search <query>             Search for users by name or email
/dm open <userId>              Open or create a conversation with a user
/dm list                       List all active conversations with unread counts
/dm read [convId]              Show message history for a conversation
/dm send <text>                Send a message in the active conversation
/dm close                      Close the active DM session and return to AI chat
/dm status                     Show WebSocket connection status and offline queue
/dm transcript on|off|export   Enable, disable, or export DM transcripts

DM Chat Mode

When you open a conversation with /dm open, OCCode enters DM chat mode. While in this mode, anything you type is encrypted and sent to your contact instead of the AI assistant; run /dm close to return to normal AI chat.

End-to-End Encryption

OCCode uses industry-standard cryptographic algorithms for message security:

Component            Algorithm             Purpose
Key Exchange         X25519 (Curve25519)   Establish shared secret between users
Message Encryption   XSalsa20-Poly1305     Authenticated encryption (TweetNaCl)
Key Fingerprint      SHA-256               Verify key identity

Key Management

Your encryption keys are generated and stored locally on your device.

Important
Your private key never leaves your device. If you lose your keys (e.g., switch machines), you'll need to generate new keys with /dm keys or /dm rotate-keys. Previous messages remain readable as long as both parties' keys are available.

Key Fingerprint Verification

When you run /dm keys, OCCode displays your key fingerprint as a colon-separated hex string (e.g., a3:4f:b2:...). You can verify this fingerprint with your contact through an out-of-band channel to confirm you're communicating with the right person.
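A fingerprint like this is simply a hash of the public key rendered as colon-separated hex pairs. A sketch (the truncation length shown is an assumption):

```python
import hashlib

def fingerprint(public_key: bytes, groups: int = 8) -> str:
    """SHA-256 the public key and render the leading bytes as
    colon-separated hex pairs, e.g. 'a3:4f:b2:...'."""
    digest = hashlib.sha256(public_key).hexdigest()
    pairs = [digest[i:i + 2] for i in range(0, groups * 2, 2)]
    return ":".join(pairs)
```

Because SHA-256 is deterministic, you and your contact will compute identical fingerprints from the same public key, which is what makes out-of-band comparison meaningful.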

DM Transcripts

You can optionally save transcripts of your DM conversations:

/dm transcript on       # Enable transcript logging
/dm transcript off      # Disable transcript logging
/dm transcript export   # Export decrypted transcript as JSON

Transcripts are stored locally at ~/.occode/dm-transcripts/ and encrypted with AES-256-GCM using a password-derived key.
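The password-derived key step might look like the following sketch using PBKDF2-HMAC-SHA256 from Python's standard library; the actual KDF, salt size, and iteration count OCCode uses are not documented here and are assumptions:

```python
import hashlib
import os

def derive_key(password: str, salt: bytes = None):
    """Derive a 32-byte key (AES-256 size) from a password.
    Salt size and iteration count are illustrative values only."""
    salt = salt if salt is not None else os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return key, salt
```

The salt must be stored alongside the ciphertext so the same key can be re-derived when reading the transcript back.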

Organization Policy
Your organization administrator may require DM transcripts to be enabled for compliance. When enforced, you cannot disable transcripts — the /dm transcript off command will show a policy notice.

Offline Message Queue

If you send a message while disconnected, it's saved to a local queue (~/.occode/dm-offline-queue.json, max 100 messages). Messages are automatically sent when your connection is restored.

Check connection status anytime with /dm status.
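The queue behaves like a bounded, persisted list. A sketch — the drop-oldest policy is an assumption; only the 100-message cap is documented:

```python
import json
from pathlib import Path

MAX_QUEUE = 100  # documented cap for the offline queue file

def enqueue(queue_path, message) -> int:
    """Append a message to the JSON queue file, keeping at most
    MAX_QUEUE entries (oldest dropped first — an assumed policy).
    Returns the queue length after the append."""
    path = Path(queue_path)
    queue = json.loads(path.read_text()) if path.exists() else []
    queue.append(message)
    queue = queue[-MAX_QUEUE:]
    path.write_text(json.dumps(queue))
    return len(queue)
```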


15.1. Multi-Channel Communications

Beyond P2P messaging, OCCode includes a multi-channel communications module that lets you send and receive messages over 8 independent channels — from Bluetooth on your desk to LoRa radios and ham radio spanning continents. Each channel is optional and only activates when configured.

Quick Start
1. Install the npm package for your channel (e.g. npm install bluetooth-serial-port)
2. Edit ~/.occode/comms.json to enable and configure the channel
3. Run /comms status to verify the connection
4. See the full Communications Guide for detailed setup instructions

Channel Overview

Bluetooth

Serial Port Profile (SPP) over paired devices. Optional AES-256-GCM encryption.

WiFi LAN

mDNS peer discovery + TCP messaging on your local network.

Modem SMS

USB GSM modem with AT commands. Send/receive real SMS.

Telegram

Bot API with long polling and optional chat-ID whitelist.

Cloud SMS

Twilio REST API for outbound + webhook for inbound SMS.

LoRa Radio

Long-range radio (RYLR896/998, RAK, RN2483). 2–15 km range.

Meshtastic

LoRa mesh with multi-hop routing and built-in AES-256.

Ham Radio

AX.25 packet radio + JS8Call weak-signal HF. Requires license.

Comms Commands Reference

Command                          Description
/comms status                    Show all channel connection statuses
/comms bt scan|send              Bluetooth device scan and messaging
/comms wifi peers|send           WiFi LAN peer discovery and messaging
/comms modem ports|send|signal   GSM modem SMS and signal check
/comms tg send                   Telegram bot messaging
/comms sms send|status           Cloud SMS via Twilio
/comms lora send|info            LoRa radio messaging
/comms mesh send|nodes|info      Meshtastic mesh networking
/comms ham send|freq|mode|info   Ham radio (AX.25 / JS8Call)
/comms history [n]               Recent messages across all channels
/comms config                    Configuration instructions

Full Documentation
For detailed per-channel setup, configuration examples, hardware requirements, and testing instructions, see the OCCode Communications Guide.

15.5. Plugin System

OCCode CLI supports plugins — sandboxed extensions that add custom slash commands, hook into lifecycle events, and access AI and context features through a permission-gated API. Plugins run in isolated VM contexts and cannot access the filesystem or Node.js modules directly.

Quick Start
1. Install a plugin: /plugin install /path/to/my-plugin
2. Enable it: /plugin enable my-plugin
3. Use its commands: /my-plugin:command-name [args]
4. Disable when done: /plugin disable my-plugin

Plugin Management Commands

Command                    Description
/plugin or /plugin list    List all installed plugins with their current state
/plugin install <path>     Install a plugin from a local directory
/plugin enable <name>      Enable a plugin (loads it in its sandbox)
/plugin disable <name>     Disable a plugin (keeps it installed)
/plugin uninstall <name>   Uninstall and remove a plugin completely
/plugin info <name>        Show plugin details: version, permissions, commands
/plugin help               Show plugin command help

Using Plugin Commands

Plugin commands use the syntax /<plugin-name>:<command-name> [arguments]:

# Run the "check" command from "my-linter" plugin
/my-linter:check src/

# Run the "format" command from "formatter" plugin
/formatter:format --style=prettier

# Run the "run" command from "test-runner" plugin
/test-runner:run --watch

Plugin States

State       Description
installed   Plugin files copied to disk, not yet active
enabled     Plugin loaded in sandbox, commands available
disabled    Plugin installed but not active
error       Plugin failed to load (check /plugin info for details)

Creating Plugins

Plugins are defined by a plugin.json manifest and a JavaScript entry point file.

Plugin Manifest (plugin.json)

{
  "name": "my-plugin",
  "version": "1.0.0",
  "displayName": "My Plugin",
  "description": "Does something useful",
  "author": "Your Name",
  "entryPoint": "dist/index.js",
  "permissions": ["commands", "context:read"],
  "commands": [
    {
      "name": "greet",
      "description": "Say hello",
      "handler": "greetHandler"
    }
  ],
  "hooks": [
    {
      "event": "command:execute",
      "handler": "onCommandExecute"
    }
  ]
}
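A loader needs to validate this manifest before enabling the plugin. A minimal sketch — the required-field set and rules here are assumptions inferred from the fields shown above, not OCCode's actual validation logic:

```python
import json

REQUIRED = ("name", "version", "entryPoint")

def validate_manifest(text: str) -> dict:
    """Parse a plugin.json and check the fields a loader would need."""
    manifest = json.loads(text)
    missing = [k for k in REQUIRED if k not in manifest]
    if missing:
        raise ValueError(f"plugin.json missing fields: {missing}")
    # Each declared command must name itself and its handler export.
    for cmd in manifest.get("commands", []):
        if "name" not in cmd or "handler" not in cmd:
            raise ValueError(f"malformed command entry: {cmd}")
    return manifest
```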

Entry Point Example (dist/index.js)

// Export command handlers
module.exports.greetHandler = async function(args) {
  const name = args || 'World';
  return `Hello, ${name}!`;
};

// Export hook handlers ("api" is the permission-gated object
// injected into the plugin's sandbox)
module.exports.onCommandExecute = async function(data) {
  api.log('Command executed:', data);
};

Available Permissions

Permission         What It Grants
commands           Define custom slash commands
context:read       Read workspace context files via api.context.read(path)
context:write      Write workspace context via api.context.write(path, content)
ai:call            Call AI completions via api.ai.complete(prompt)
messages:read      Read conversation messages via api.messages.read()
messages:write     Write messages via api.messages.write(content)
network            Make HTTPS requests via api.network.fetch(url)
hooks:pre          Register pre-execution hooks
hooks:post         Register post-execution hooks
ui                 Modify display elements
dm:read            Read direct messages (may be blocked by org policy)
dm:write           Send direct messages (may be blocked by org policy)
transcripts:read   Read transcripts (may be blocked by org policy)

Organization Restrictions
Your organization administrator may restrict which plugins can be installed and which permissions are available. If a plugin requires a blocked permission, that capability will be silently disabled. Check with your admin if a plugin isn't working as expected.

Plugin File Structure

my-plugin/
  plugin.json        # Required: manifest with metadata and permissions
  dist/
    index.js         # Required: entry point (referenced by entryPoint field)
    helpers.js       # Optional: additional code (bundled into entry point)

Plugins are stored at ~/.occode/plugins/ after installation. The plugin registry is tracked in ~/.occode/plugins/plugins.json.


16. Configuration Reference

Global Config: ~/.occode/config.json

{
  "provider": "anthropic",
  "model": "claude-sonnet-4-20250514",
  "maxTokens": 4096,
  "temperature": 0.7,
  "autoApprove": false,
  "features": {
    "tdg": false,
    "autoFix": true,
    "coverageGuidedTests": true
  }
}

Project Config: .occode.json

{
  "provider": "anthropic",
  "model": "claude-sonnet-4-20250514",
  "mode": "interactive",
  "contextPatterns": ["src/**/*"],
  "ignorePatterns": ["node_modules", "dist"]
}

Environment Variables

Variable              Description         Example
OCCODE_PROVIDER       AI provider         anthropic
OCCODE_MODEL          Model name          claude-sonnet-4-20250514
OCCODE_API_ENDPOINT   Custom endpoint     http://localhost:11434/v1
OCCODE_MAX_TOKENS     Max output tokens   4096
OCCODE_TEMPERATURE    Temperature (0-1)   0.7
OCCODE_API_KEY        Fallback API key
ANTHROPIC_API_KEY     Anthropic key       sk-ant-...
OPENAI_API_KEY        OpenAI key          sk-...
GOOGLE_API_KEY        Google/Gemini key
DEEPSEEK_API_KEY      DeepSeek key
MISTRAL_API_KEY       Mistral key
GROQ_API_KEY          Groq key
TOGETHER_API_KEY      Together AI key
OPENROUTER_API_KEY    OpenRouter key      sk-or-v1-...
OPENCAN_API_KEY       OpenCan key

Configuration Priority (highest to lowest)

  1. Command-line flags (--model, --provider)
  2. Environment variables (OCCODE_MODEL)
  3. Project config (.occode.json)
  4. Global config (~/.occode/config.json)
  5. Built-in defaults
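The priority order amounts to a first-match lookup across the five sources. A conceptual sketch, not OCCode's implementation:

```python
def resolve(key, cli_flags, env, project_cfg, global_cfg, defaults):
    """Return the first defined value for `key`, following the
    documented priority order (highest to lowest)."""
    sources = [
        cli_flags,                                 # 1. --model, --provider
        {key: env.get(f"OCCODE_{key.upper()}")},   # 2. OCCODE_* variables
        project_cfg,                               # 3. .occode.json
        global_cfg,                                # 4. ~/.occode/config.json
        defaults,                                  # 5. built-in defaults
    ]
    for source in sources:
        value = source.get(key)
        if value is not None:
            return value
    return None
```

For example, an `OCCODE_MODEL` environment variable overrides both config files but loses to an explicit `--model` flag.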

17. Troubleshooting

API Key Not Found

occode config --set-key --provider anthropic
# Or set via environment:
export ANTHROPIC_API_KEY="sk-ant-..."

Model Not Available

# Check configuration:
occode config --list

# Verify model name for your provider:
/model # Shows current model
/converge catalog # Browse all available models

Command Timeout

occode run "task" --timeout 600 # 10 minutes

Local Model Connection (Ollama)

# Ensure Ollama is running:
curl http://localhost:11434/api/tags

# Start if needed:
ollama serve

# Check available models:
ollama list

Session Won't Resume

# Check session ID:
occode history

# Sessions stored at:
ls ~/.occode/sessions/

High Token Usage

/features cost # Check token multiplier
/features # See which features are enabled
/compact # Compress conversation history

18. Tips & Best Practices

1. Be Specific: Clear, detailed instructions produce much better results than vague requests.
2. Use Context: Add relevant files with /context add or -c flag so the AI understands your code.
3. Monitor Costs: Check /cost regularly. Use /profile fast for simple tasks to save money.
4. Use Checkpoints: Before risky operations, create a checkpoint with /checkpoint "before big refactor".
5. Preview First: Use --dry-run to see what the AI would do before executing.
6. Compact Long Sessions: Use /compact to summarize conversation history and reduce token usage.
7. Use Profiles: Set up model profiles for different tasks: fast for quick queries, power for complex work.
8. Local for Privacy: Use Ollama for sensitive code that should never leave your machine.

OCCode CLI User Guide — Version 1.1 — February 14, 2026

© 2025-2026 OpenCan.ai — All Rights Reserved

Source references: occode/README.md, occode/COMPLETE_FEATURE_LIST.md, occode/CONFIGURATION.md, occode/AI_PROVIDERS.md, occode/ENVIRONMENT_VARIABLES.md, occode/DAEMON_ARCHITECTURE.md, occode/FEATURE_TOGGLES.md, occode/OCCODE_CLI_DESIGN_DOCUMENT.md, occode/src/cli/run.ts
