diff --git a/.github/workflows/release-auto-tag.yml b/.github/workflows/release-auto-tag.yml
index 97efa81d..569c3338 100644
--- a/.github/workflows/release-auto-tag.yml
+++ b/.github/workflows/release-auto-tag.yml
@@ -6,7 +6,7 @@ name: Release Auto-Tag
 on:
   workflow_dispatch: {}
   schedule:
-    - cron: "0 14 * * *" # 7 AM PDT
+    - cron: "0 14 * * 1-5" # 7 AM PDT, weekdays only
 
 permissions:
   contents: write
diff --git a/README.md b/README.md
index 1800a468..88ef9117 100644
--- a/README.md
+++ b/README.md
@@ -14,7 +14,6 @@ OpenShell is built agent-first. The project ships with agent skills for everythi
 
 ## Quickstart
 
-
 ### Prerequisites
 
 - **Docker** — Docker Desktop (or a Docker daemon) must be running.
@@ -33,6 +32,8 @@ curl -LsSf https://raw.githubusercontent.com/NVIDIA/OpenShell/main/install.sh |
 uv tool install -U openshell
 ```
 
+Both methods install the latest stable release by default. To install a specific version, set `OPENSHELL_VERSION` (binary) or pin the version with `uv tool install openshell==<version>`. A [`dev` release](https://github.com/NVIDIA/OpenShell/releases/tag/dev) is also available that tracks the latest commit on `main`.
+
 ### Create a sandbox
 
 ```bash
@@ -117,7 +118,9 @@ Policies are declarative YAML files. Static sections (filesystem, process) are l
 
 Agents need credentials — API keys, tokens, service accounts. OpenShell manages these as **providers**: named credential bundles that are injected into sandboxes at creation. The CLI auto-discovers credentials for recognized agents (Claude, Codex, OpenCode) from your shell environment, or you can create providers explicitly with `openshell provider create`. Credentials never leak into the sandbox filesystem; they are injected as environment variables at runtime.
 
-## GPU Support
+## GPU Support (Experimental)
+
+> **Experimental** — GPU passthrough works on supported hosts but is under active development. Expect rough edges and breaking changes.
 
 OpenShell can pass host GPUs into sandboxes for local inference, fine-tuning, or any GPU workload. Add `--gpu` when creating a sandbox:
 
@@ -137,7 +140,7 @@ The CLI auto-bootstraps a GPU-enabled gateway on first use. GPU intent is also i
 | [OpenCode](https://opencode.ai/) | [`base`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/base) | Works out of the box. Provider uses `OPENAI_API_KEY` or `OPENROUTER_API_KEY`. |
 | [Codex](https://developers.openai.com/codex) | [`base`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/base) | Works out of the box. Provider uses `OPENAI_API_KEY`. |
 | [OpenClaw](https://openclaw.ai/) | [Community](https://github.com/NVIDIA/OpenShell-Community) | Launch with `openshell sandbox create --from openclaw`. |
-| [Ollama](https://ollama.com/) | [Community](https://github.com/NVIDIA/OpenShell-Community) | Launch with `openshell sandbox create --from ollama`. | 
+| [Ollama](https://ollama.com/) | [Community](https://github.com/NVIDIA/OpenShell-Community) | Launch with `openshell sandbox create --from ollama`. |
 
 ## Key Commands