Configuration
Status: ✅ Complete
Last Updated: December 3, 2025
RiceCoder is highly configurable, allowing you to customize behavior, appearance, and integrations to match your workflow. This guide covers all configuration options, file locations, environment variables, and the configuration hierarchy.
RiceCoder loads configuration in priority order (highest to lowest):
- Runtime Overrides - CLI flags and environment variables
- Project Configuration - .agent/config.yaml in your project
- User Configuration - ~/.ricecoder/config.yaml in your home directory
- Built-in Defaults - Hardcoded defaults in RiceCoder
Higher-priority levels override lower ones. For example, if you set provider: openai in your user config but provider: ollama in your project config, the project config takes precedence.
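The layered override described above can be sketched as a key-by-key merge applied from lowest to highest priority (a hypothetical Python illustration, not RiceCoder's actual implementation):

```python
# Hypothetical sketch: each configuration layer overrides the ones before it.
def resolve(*layers):
    merged = {}
    for layer in layers:  # ordered lowest to highest priority
        merged.update(layer)
    return merged

user = {"provider": "openai", "model": "gpt-4", "theme": "dracula"}
project = {"provider": "ollama", "model": "mistral"}

config = resolve(user, project)
# config: provider=ollama, model=mistral (project wins), theme=dracula (from user)
```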
# ~/.ricecoder/config.yaml (User level)
provider: openai
model: gpt-4
theme: dracula
# .agent/config.yaml (Project level)
provider: ollama
model: mistral
# Result: provider=ollama (project overrides user), model=mistral (project), theme=dracula (from user)

Location: ~/.ricecoder/config.yaml
Purpose: User-level settings that apply to all projects
Created by: rice init command
Example:
# ~/.ricecoder/config.yaml
provider: openai
api-key: sk-...
theme: dracula
log-level: info

Location: .agent/config.yaml in your project root
Purpose: Project-specific settings that override global configuration
Created by: rice init in a project directory
Example:
# .agent/config.yaml
provider: ollama
model: mistral
permissions:
  file-write: ask
  file-delete: deny

RiceCoder uses YAML format for configuration files. YAML is human-readable and supports:
- Strings: key: value
- Numbers: timeout: 30
- Booleans: enabled: true
- Lists: models: [gpt-4, gpt-3.5-turbo]
- Objects: Nested configuration with indentation
# Comments start with #
# Strings
name: RiceCoder
# Numbers
timeout: 30
port: 8080
# Booleans
enabled: true
debug: false
# Lists
models:
- gpt-4
- gpt-3.5-turbo
- claude-3-opus
# Objects (nested)
provider:
  name: openai
  api-key: sk-...
  timeout: 30

Type: String
Valid Values: openai, anthropic, github-copilot, ollama, custom
Default: openai
Description: Which AI provider to use for code generation and chat
Example:
provider: openai

Type: String
Valid Values: Depends on provider (see AI Providers Guide)
Default: gpt-4 (for OpenAI)
Description: Which model to use from the selected provider
Example:
model: gpt-4

Type: String
Default: None (required for cloud providers)
Description: API key for the selected provider
Security Note: Never commit this to version control. Use environment variables instead.
Example:
api-key: ${OPENAI_API_KEY} # Use environment variable

Type: String
Default: http://localhost:11434
Description: URL where Ollama is running
Example:
provider: ollama
ollama-url: http://localhost:11434

Type: Number (seconds)
Default: 300
Description: Timeout for Ollama requests
Example:
ollama-timeout: 600

Type: String
Valid Values: dracula, nord, solarized, monokai, gruvbox
Default: dracula
Description: Color theme for the TUI interface
Example:
theme: nord

Type: Number
Default: 12
Description: Font size for terminal rendering (if supported)
Example:
font-size: 14

Type: String
Valid Values: error, warn, info, debug, trace
Default: info
Description: Verbosity of logging output
Example:
log-level: debug

Type: Boolean
Default: false
Description: Automatically approve generated code without asking
Security Note: Use with caution; always review generated code
Example:
auto-approve: false

Type: Number (seconds)
Default: 300
Description: Default timeout for operations
Example:
timeout: 600

Type: Object
Description: Control what RiceCoder can do
Valid Permissions:
- file-write: Write files (allow, ask, deny)
- file-delete: Delete files (allow, ask, deny)
- shell-execute: Execute shell commands (allow, ask, deny)
- network-access: Access network (allow, ask, deny)
Default: All set to ask
Example:
permissions:
  file-write: ask
  file-delete: deny
  shell-execute: ask
  network-access: allow

Type: Object
Description: Define custom shell commands for use in chat
Example:
commands:
  test:
    command: cargo test
    description: Run tests
  build:
    command: cargo build --release
    description: Build release binary
  fmt:
    command: cargo fmt
    description: Format code

Usage in Chat:
> /test

RiceCoder supports environment variables for sensitive configuration and overrides.
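How a runtime environment variable overrides a file setting can be sketched as a simple lookup with fallback (illustrative only; RICECODER_PROVIDER is one of the variables documented in this section):

```python
import os

def effective_setting(file_config, key, env_name):
    # The env var, when set, takes precedence over the config-file value.
    return os.environ.get(env_name, file_config.get(key))

os.environ["RICECODER_PROVIDER"] = "ollama"
provider = effective_setting({"provider": "openai"}, "provider", "RICECODER_PROVIDER")
# provider is "ollama": the environment variable wins over the file value
```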
Description: Override the configured provider
Example:
export RICECODER_PROVIDER=ollama
rice chat

Description: Override the configured model
Example:
export RICECODER_MODEL=gpt-3.5-turbo
rice chat

Description: OpenAI API key (used if api-key is not set in config)
Example:
export OPENAI_API_KEY=sk-...

Description: Anthropic API key
Example:
export ANTHROPIC_API_KEY=sk-ant-...

Description: GitHub token for GitHub Copilot
Example:
export GITHUB_TOKEN=ghp_...

Description: Override the default RiceCoder home directory
Default: ~/.ricecoder
Example:
export RICECODER_HOME=/custom/path

Description: Override the configuration file location
Default: ~/.ricecoder/config.yaml
Example:
export RICECODER_CONFIG=/path/to/config.yaml

Description: Override the log level
Valid Values: error, warn, info, debug, trace
Example:
export RICECODER_LOG_LEVEL=debug

Description: Override the default timeout (in seconds)
Example:
export RICECODER_TIMEOUT=600

For quick setup with OpenAI:
# ~/.ricecoder/config.yaml
provider: openai
api-key: ${OPENAI_API_KEY}
model: gpt-4

For offline development with local models:
# ~/.ricecoder/config.yaml
provider: ollama
ollama-url: http://localhost:11434
model: mistral
log-level: debug

For team environments with strict permissions:
# .agent/config.yaml
provider: anthropic
api-key: ${ANTHROPIC_API_KEY}
model: claude-3-opus
permissions:
  file-write: ask
  file-delete: deny
  shell-execute: ask
  network-access: allow
log-level: info
timeout: 600
commands:
  test:
    command: cargo test
    description: Run tests
  lint:
    command: cargo clippy
    description: Run linter

Switch between providers for different tasks:
# ~/.ricecoder/config.yaml
provider: openai
model: gpt-4
# Project-specific override for local development
# .agent/config.yaml
provider: ollama
model: mistral

Then switch at runtime:
# Use project config (Ollama)
rice chat
# Override with environment variable
RICECODER_PROVIDER=openai rice chat

For faster responses with cost optimization:
# ~/.ricecoder/config.yaml
provider: openai
model: gpt-3.5-turbo
timeout: 60
log-level: warn

Set configuration values:
rice config set provider openai
rice config set model gpt-4
rice config set api-key sk-...

Get configuration values:
rice config get provider
rice config get model

Show the active configuration:
rice config show

Reset configuration to defaults:
rice config reset

Edit ~/.ricecoder/config.yaml or .agent/config.yaml with your text editor:
# Edit global config
nano ~/.ricecoder/config.yaml
# Edit project config
nano .agent/config.yaml

Override configuration at runtime:
export OPENAI_API_KEY=sk-...
export RICECODER_PROVIDER=openai
export RICECODER_MODEL=gpt-4
rice chat

Problem: RiceCoder can't find your configuration file
Solution:
- Check if ~/.ricecoder/config.yaml exists:
ls -la ~/.ricecoder/config.yaml
- If not, initialize RiceCoder:
rice init
- Or specify the config file:
export RICECODER_CONFIG=/path/to/config.yaml

Problem: RiceCoder can't find your API key
Solution:
- Check if API key is set in config:
rice config get api-key
- If not, set it:
rice config set api-key sk-...
- Or use an environment variable:
export OPENAI_API_KEY=sk-...

Problem: Configuration file has syntax errors
Solution:
- Check YAML syntax (indentation, quotes, etc.)
- Validate with a YAML validator: https://www.yamllint.com/
- Check for common issues:
- Incorrect indentation (use spaces, not tabs)
- Missing quotes around strings with special characters
- Incorrect boolean values (use true/false, not yes/no)
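Because YAML rejects tab indentation, a quick scan for tab-indented lines catches the most common syntax error; this is a hypothetical helper, not part of RiceCoder:

```python
def find_tab_indented_lines(text):
    # Return 1-based numbers of lines whose leading whitespace contains a tab.
    bad = []
    for number, line in enumerate(text.splitlines(), start=1):
        indent = line[: len(line) - len(line.lstrip())]
        if "\t" in indent:
            bad.append(number)
    return bad

sample = "permissions:\n\tfile-write: ask\n  file-delete: deny\n"
find_tab_indented_lines(sample)  # line 2 uses a tab and would break YAML parsing
```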
Problem: Changes to configuration don't seem to apply
Solution:
- Check configuration hierarchy - project config overrides user config
- Check environment variables - they override file configuration
- Verify the correct file is being edited:
rice config show # Shows which config is active
- Restart RiceCoder after making changes
Problem: RiceCoder is blocked from performing an action
Solution:
- Check permissions configuration:
rice config get permissions
- Update permissions:
permissions:
  file-write: allow
  file-delete: ask
  shell-execute: ask
- Or use the CLI:
rice config set permissions.file-write allow

Problem: Can't connect to Ollama
Solution:
- Check if Ollama is running:
curl http://localhost:11434/api/tags
- If not, start Ollama:
ollama serve
- Check the URL in configuration:
rice config get ollama-url
- If using a different URL, update it:
rice config set ollama-url http://your-ollama-url:11434

Problem: The specified model doesn't exist
Solution:
- Check available models:
# For Ollama
ollama list
# For OpenAI
rice config show # Check model name
- Pull the model (Ollama):
ollama pull mistral
- Update configuration:
rice config set model mistral

Never commit API keys to version control:
# ✓ Good
api-key: ${OPENAI_API_KEY}
# ✗ Bad
api-key: sk-...

Store team-specific settings in .agent/config.yaml:
# .agent/config.yaml
provider: anthropic
permissions:
  file-delete: deny

Add descriptions to custom commands:
commands:
  test:
    command: cargo test
    description: Run all tests
  lint:
    command: cargo clippy
    description: Run linter and check for warnings

Available log levels:
- error: Only errors
- warn: Errors and warnings
- info: General information (default)
- debug: Detailed debugging information
- trace: Very detailed tracing
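Log-level filtering typically compares severities in the order listed above; here is a minimal sketch of the idea (not RiceCoder's actual logger):

```python
LEVELS = ("error", "warn", "info", "debug", "trace")

def should_log(message_level, configured_level):
    # Emit a message when its severity rank is at or above the threshold:
    # error is most severe, trace is least.
    return LEVELS.index(message_level) <= LEVELS.index(configured_level)

should_log("warn", "info")   # True: warnings appear at the default level
should_log("debug", "info")  # False: debug output requires log-level: debug
```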
Adjust timeouts based on your network and model:
timeout: 300 # 5 minutes for complex operations

Be explicit about what RiceCoder can do:
permissions:
  file-write: ask # Ask before writing files
  file-delete: deny # Never delete files
  shell-execute: ask # Ask before running commands

- Quick Start Guide - Get started quickly
- CLI Commands - All available commands
- AI Providers Guide - Provider setup and comparison
- Local Models Guide - Using Ollama for local models
- Troubleshooting Guide - Common issues and solutions