Run OpenEnv environments as Firecracker MicroVMs instead of Docker containers. Get 4x faster boot times with hardware-level isolation.
| Environment | Docker Boot | MicroVM Boot | Speedup |
|---|---|---|---|
| echo_env | 520ms | 125ms | 4.2x |
| chess_env | 535ms | 128ms | 4.2x |
| connect4_env | 510ms | 130ms | 3.9x |
| maze_env | 525ms | 128ms | 4.1x |
| snake_env | 515ms | 126ms | 4.1x |
| grid_world_env | 505ms | 125ms | 4.0x |
MicroVMs also provide stronger isolation (KVM hardware virtualization vs Linux namespaces).
# Build from source
cargo build --release
# Convert environment to MicroVM (works on macOS/Linux/Windows, requires Docker)
./target/release/openenvvm convert ./envs/echo_env -o echo_env.microvm
# Run MicroVM (Linux only, requires KVM + Firecracker)
sudo ./target/release/openenvvm run echo_env.microvm --port 8000

Requirements:
- Converting environments: Docker (macOS/Linux/Windows)
- Running MicroVMs: Linux with KVM + Firecracker
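Once the VM is running, the guest's uvicorn is reachable on the given port. A small polling helper makes smoke tests easy (a sketch: `wait_for_http` is our own helper, and the `/health` path is an assumption — check your environment's server routes for the real endpoints):

```shell
# wait_for_http URL [TRIES]: poll URL until it answers or TRIES run out.
wait_for_http() {
  url=$1
  tries=${2:-20}
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -fsS "$url" >/dev/null 2>&1; then
      return 0        # got a successful response
    fi
    i=$((i + 1))
    sleep 0.5
  done
  return 1
}

# After `openenvvm run echo_env.microvm --port 8000`:
# wait_for_http http://localhost:8000/health && echo "environment is up"
```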
# Install Firecracker (Linux)
curl -L https://github.com/firecracker-microvm/firecracker/releases/download/v1.6.0/firecracker-v1.6.0-$(uname -m).tgz | tar xz
sudo mv release-v1.6.0-*/firecracker-v1.6.0-* /usr/local/bin/firecracker

11 of 28 OpenEnv environments fully convert, boot, and pass validation. The remaining environments have upstream issues (missing dependency files, external service requirements, or native binary dependencies). See COMPATIBILITY.md for details on each environment and what needs to change upstream.
Fully working: atari_env, calendar_env, coding_env, connect4_env, echo_env, grid_world_env, julia_env, maze_env, reasoning_gym_env, repl_env, wildfire_env
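To convert everything in one pass, a shell loop over the environments directory works (a sketch: the `./envs` layout and `.microvm` naming follow the examples above, and failures are expected for the environments listed in COMPATIBILITY.md):

```shell
# Convert every environment under ./envs; skip any that fail upstream.
for env in ./envs/*/; do
  [ -d "$env" ] || continue          # no matches: glob stays literal
  name=$(basename "$env")
  if ./target/release/openenvvm convert "$env" -o "$name.microvm"; then
    echo "converted $name"
  else
    echo "skipped $name (see COMPATIBILITY.md)" >&2
  fi
done
```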
- Convert: Package the environment into an ext4 rootfs + Linux kernel
- Boot: Firecracker starts the VM in ~125ms
- Serve: uvicorn runs your environment on port 8000
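Under the hood, booting means handing Firecracker a VM configuration referencing the packaged kernel and rootfs. A sketch in Firecracker's standard config-file format (the exact `config.json` that `convert` emits may differ; sizes here are illustrative):

```json
{
  "boot-source": {
    "kernel_image_path": "vmlinux",
    "boot_args": "console=ttyS0 reboot=k panic=1 pci=off"
  },
  "drives": [
    {
      "drive_id": "rootfs",
      "path_on_host": "rootfs.ext4",
      "is_root_device": true,
      "is_read_only": false
    }
  ],
  "machine-config": {
    "vcpu_count": 1,
    "mem_size_mib": 256
  }
}
```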
OpenEnv Directory MicroVM Package Running VM
┌─────────────┐ ┌─────────────┐ ┌─────────────┐
│ server/ │ convert │ rootfs.ext4 │ run │ Linux 5.10 │
│ models.py │ ────────► │ vmlinux │ ───────► │ Debian slim │
│ client.py │ │ config.json │ │ uvicorn:8000│
└─────────────┘ └─────────────┘ └─────────────┘
MIT